Patent abstract:
Operational management of an autonomous vehicle may include traversing, by the autonomous vehicle, a vehicle transport network. Traversing the vehicle transport network may include receiving, from a sensor of the autonomous vehicle, sensor information corresponding to an external object within a defined distance from the autonomous vehicle; identifying a distinct vehicle operational scenario in response to receiving the sensor information; instantiating a scenario-specific operational control evaluation module instance, wherein the scenario-specific operational control evaluation module instance is an instance of a scenario-specific operational control evaluation module modeling the distinct vehicle operational scenario; receiving a candidate vehicle control action from the scenario-specific operational control evaluation module instance; and traversing a portion of the vehicle transport network based on the candidate vehicle control action.
Publication number: BR112019016266A2
Application number: R112019016266
Filing date: 2017-02-10
Publication date: 2020-04-07
Inventors: Wray Kyle; WITWICKI Stefan; Zilberstein Shlomo; Pedersen Liam
Applicants: Nissan North America, Inc.; The University Of Massachusetts
Primary IPC:
Patent description:

“AUTONOMOUS VEHICLE OPERATIONAL MANAGEMENT CONTROL”
TECHNICAL FIELD [001] This disclosure relates to autonomous vehicle operational management and autonomous driving.
BACKGROUND [002] A vehicle, such as an autonomous vehicle, may traverse a portion of a vehicle transport network. Traversing the portion of the vehicle transport network may include generating or capturing, such as by a sensor of the vehicle, data, such as data representing an operational environment, or a portion thereof, of the vehicle. Accordingly, a system, method, and apparatus for autonomous vehicle operational management control may be advantageous.
SUMMARY [003] Disclosed herein are aspects, features, elements, implementations, and embodiments of autonomous vehicle operational management control.
[004] An aspect of the disclosed embodiments is a method for use in traversing a vehicle transport network, which may include traversing, by an autonomous vehicle, a vehicle transport network, wherein traversing the vehicle transport network includes receiving, from a sensor of the autonomous vehicle, sensor information corresponding to an external object within a defined distance from the autonomous vehicle; identifying a distinct vehicle operational scenario in response to receiving the sensor information; instantiating a scenario-specific operational control evaluation module instance, wherein the scenario-specific operational control evaluation module instance is an instance of a scenario-specific operational control evaluation module modeling the distinct vehicle operational scenario; receiving a candidate vehicle control action from the scenario-specific operational control evaluation module instance; and traversing a portion of the vehicle transport network based on the candidate vehicle control action.
Petition 870190075707, of 06/08/2019, p. 9/149
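The sequence of operations recited above — receive sensor information, identify a distinct scenario, instantiate a module instance, receive a candidate vehicle control action, and traverse — can be illustrated as a simple control loop. The following sketch is purely illustrative: the function and class names, the distance threshold, and the scenario-to-action mappings are assumptions made for exposition and are not part of the disclosed method.

```python
# Illustrative, non-limiting sketch of the traverse sequence described
# above. All names and values are hypothetical; the disclosure does not
# specify an API.

DEFINED_DISTANCE = 50.0  # meters; example threshold, not from the disclosure


def identify_scenario(sensor_info):
    """Map sensor information about an external object to a scenario label."""
    if sensor_info["object_type"] == "pedestrian":
        return "pedestrian"
    if sensor_info["object_type"] == "vehicle":
        return "intersection"
    return None


class ScenarioModuleInstance:
    """Stand-in for a scenario-specific operational control evaluation
    module instance; a real instance would evaluate a model such as a
    POMDP rather than return a fixed action."""

    def __init__(self, scenario):
        self.scenario = scenario

    def candidate_action(self):
        # Fixed, conservative placeholder action per scenario.
        return {"pedestrian": "stop", "intersection": "edge"}.get(
            self.scenario, "maintain")


def traverse_step(sensor_info):
    """One iteration: identify, instantiate, receive a candidate action."""
    if sensor_info["distance"] > DEFINED_DISTANCE:
        return "maintain"  # object outside the defined distance
    scenario = identify_scenario(sensor_info)
    if scenario is None:
        return "maintain"
    instance = ScenarioModuleInstance(scenario)  # instantiate
    return instance.candidate_action()           # receive candidate action
```

For instance, a pedestrian reported 10 meters away yields the conservative candidate action "stop", while an object beyond the defined distance leaves the current action unchanged.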
[005] Another aspect of the disclosed embodiments is a method for use in traversing a vehicle transport network, which may include traversing, by an autonomous vehicle, a vehicle transport network, wherein traversing the vehicle transport network includes generating an autonomous vehicle operational control environment for operating scenario-specific operational control evaluation module instances, wherein each scenario-specific operational control evaluation module instance is an instance of a respective scenario-specific operational control evaluation module from a plurality of scenario-specific operational control evaluation modules, wherein each scenario-specific operational control evaluation module models a respective distinct vehicle operational scenario from a plurality of distinct vehicle operational scenarios, and wherein each scenario-specific operational control evaluation module instance generates a respective candidate vehicle control action responsive to the respective corresponding distinct vehicle operational scenario; receiving, from at least one sensor of a plurality of sensors of the autonomous vehicle, sensor information corresponding to one or more external objects within a defined distance from the autonomous vehicle; identifying a first distinct vehicle operational scenario from the distinct vehicle operational scenarios in response to receiving the sensor information; instantiating a first scenario-specific operational control evaluation module instance from the scenario-specific operational control evaluation module instances based on a first external object from the one or more external objects, wherein the first scenario-specific operational control evaluation module instance is an instance of a first scenario-specific operational control evaluation module from the plurality of scenario-specific operational control evaluation modules, the first scenario-specific operational control evaluation module modeling the first distinct vehicle operational scenario; receiving a first candidate vehicle control action from the first scenario-specific operational control evaluation module instance; and traversing a portion of the vehicle transport network based on the first candidate vehicle control action.
[006] Another aspect of the disclosed embodiments is an autonomous vehicle for autonomous vehicle operational management control. The autonomous vehicle may include a processor configured to execute instructions stored on a non-transitory computer-readable medium to generate an autonomous vehicle operational control environment for operating scenario-specific operational control evaluation module instances, wherein each scenario-specific operational control evaluation module instance is an instance of a respective scenario-specific operational control evaluation module from a plurality of scenario-specific operational control evaluation modules, wherein each scenario-specific operational control evaluation module models a respective distinct vehicle operational scenario from a plurality of distinct vehicle operational scenarios, and wherein each scenario-specific operational control evaluation module instance generates a respective candidate vehicle control action responsive to the respective corresponding distinct vehicle operational scenario; receive, from at least one sensor of a plurality of sensors of the autonomous vehicle, sensor information corresponding to one or more external objects within a defined distance from the autonomous vehicle; identify a first distinct vehicle operational scenario from the distinct vehicle operational scenarios in response to receiving the sensor information; instantiate a first scenario-specific operational control evaluation module instance from the scenario-specific operational control evaluation module instances based on a first external object from the one or more external objects, wherein the first scenario-specific operational control evaluation module instance is an instance of a first scenario-specific operational control evaluation module from the plurality of scenario-specific operational control evaluation modules, the first scenario-specific operational control evaluation module modeling the first distinct vehicle operational scenario; receive a first candidate vehicle control action from the first scenario-specific operational control evaluation module instance; and control the autonomous vehicle to traverse a portion of the vehicle transport network based on the first candidate vehicle control action.
[007] Variations in these and other aspects, features, elements, implementations, and embodiments of the methods, apparatus, procedures, and algorithms disclosed herein are described in further detail below.
BRIEF DESCRIPTION OF THE DRAWINGS [008] The various aspects of the methods and apparatus disclosed herein will become more apparent by reference to the examples provided in the following description and drawings, in which:
[009] Figure 1 is a diagram of an example of a vehicle in which the aspects, features, and elements disclosed herein may be implemented;
[010] Figure 2 is a diagram of an example of a portion of a vehicle transport and communication system in which the aspects, features, and elements disclosed herein may be implemented;
[011] Figure 3 is a diagram of a portion of a vehicle transport network in accordance with this disclosure;
[012] Figure 4 is a diagram of an example of an autonomous vehicle operational management system in accordance with embodiments of this disclosure;
[013] Figure 5 is a flow diagram of an example of autonomous vehicle operational management in accordance with embodiments of this disclosure;
[014] Figure 6 is a diagram of an example of a blocking scene in accordance with embodiments of this disclosure;
[015] Figure 7 is a diagram of an example of a pedestrian scene including pedestrian scenarios in accordance with embodiments of this disclosure;
[016] Figure 8 is a diagram of an example of an intersection scene including intersection scenarios in accordance with embodiments of this disclosure; and
[017] Figure 9 is a diagram of an example of a lane-change scene including a lane-change scenario in accordance with embodiments of this disclosure.
DETAILED DESCRIPTION [018] A vehicle, such as an autonomous vehicle or a semi-autonomous vehicle, may traverse a portion of a vehicle transport network. The vehicle may include one or more sensors, and traversing the vehicle transport network may include the sensors generating or capturing sensor data, such as data corresponding to an operational environment of the vehicle, or a portion thereof. For example, the sensor data may include information corresponding to one or more external objects, such as pedestrians, remote vehicles, other objects within the vehicle operational environment, vehicle transport network geometry, or a combination thereof.
[019] The autonomous vehicle may include an autonomous vehicle operational management system, which may include one or more operational environment monitors that may process operational environment information, such as the sensor data, for the autonomous vehicle. The operational environment monitors may include a blocking monitor that may determine probability-of-availability information for portions of the vehicle transport network that are spatiotemporally proximate to the autonomous vehicle.
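One plausible way for such a blocking monitor to estimate a probability of availability for a portion of the network is to treat each tracked external object's occupancy as independent and compute the chance that no object occupies that portion. This formulation, and the numbers in the usage note, are assumptions made for illustration, not a computation specified by the disclosure.

```python
def probability_of_availability(occupancy_probs):
    """Probability that a portion of the vehicle transport network is
    available, given per-object probabilities of occupying it and
    assuming the external objects behave independently: the chance
    that no object occupies the portion."""
    p_free = 1.0
    for p_occupied in occupancy_probs:
        p_free *= (1.0 - p_occupied)
    return p_free
```

For example, with two tracked objects whose occupancy probabilities for a portion are 0.2 and 0.5, the portion is available with probability 0.8 × 0.5 = 0.4; with no tracked objects, the portion is available with probability 1.0.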
[020] The autonomous vehicle operational management system may include an autonomous vehicle operational management controller, or executor, which may detect one or more operational scenarios, such as pedestrian scenarios, intersection scenarios, lane-change scenarios, or any other vehicle operational scenario or combination of vehicle operational scenarios, corresponding to the external objects.
[021] The autonomous vehicle operational management system may include one or more scenario-specific operational control evaluation modules. Each scenario-specific operational control evaluation module may be a model, such as a Partially Observable Markov Decision Process (POMDP) model, of a respective operational scenario. The autonomous vehicle operational management controller may instantiate respective instances of the scenario-specific operational control evaluation modules in response to detecting the corresponding operational scenarios.
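As a concrete illustration of modeling a distinct scenario under partial observability, the fragment below defines a toy pedestrian model — two hidden intents and noisy observations — and performs the Bayesian belief update that a POMDP-based module instance would rely on. The states, observation probabilities, threshold, and names are invented for illustration; the disclosure does not prescribe any particular model parameters.

```python
# Toy POMDP-style belief update for a hypothetical pedestrian scenario.
# Hidden states: the pedestrian intends to "cross" or "wait".
# Observations: the pedestrian appears "moving" or "stationary" (noisy).

# P(observation | state) — illustrative numbers only.
OBS_MODEL = {
    "cross": {"moving": 0.8, "stationary": 0.2},
    "wait":  {"moving": 0.3, "stationary": 0.7},
}


def update_belief(belief, observation):
    """Bayes rule: posterior is proportional to likelihood times prior,
    then normalized over the hidden states."""
    posterior = {s: OBS_MODEL[s][observation] * p for s, p in belief.items()}
    total = sum(posterior.values())
    return {s: p / total for s, p in posterior.items()}


def candidate_action(belief, stop_threshold=0.5):
    """Derive a candidate vehicle control action from the current belief."""
    return "stop" if belief["cross"] >= stop_threshold else "advance"
```

Starting from a uniform prior, observing "moving" shifts the belief toward "cross" (0.8 × 0.5 against 0.3 × 0.5, normalizing to roughly 0.73), so the module instance would emit the conservative candidate action "stop"; observing "stationary" shifts it the other way.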
[022] The autonomous vehicle operational management controller may receive candidate vehicle control actions from the respective instantiated scenario-specific operational control evaluation module instances, may identify a vehicle control action from the candidate vehicle control actions, and may control the autonomous vehicle to traverse a portion of the vehicle transport network in accordance with the identified vehicle control action.
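One plausible way for the controller to identify a single vehicle control action from several candidates is to rank the actions by conservatism and take the most conservative; the ranking and action names below are assumptions for illustration, not an arbitration rule stated in the disclosure.

```python
# Hypothetical arbitration over candidate vehicle control actions.
# Lower rank means more conservative; the ordering is illustrative only.
ACTION_RANK = {
    "stop": 0, "edge": 1, "decelerate": 2, "maintain": 3, "accelerate": 4,
}


def identify_control_action(candidates):
    """Select one vehicle control action from the candidates produced by
    the instantiated scenario-specific module instances."""
    if not candidates:
        return "maintain"  # no active scenario instances
    return min(candidates, key=ACTION_RANK.__getitem__)
```

For example, if a pedestrian module instance proposes "stop" while an intersection module instance proposes "edge", this policy controls the vehicle with "stop".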
[023] Although described herein with reference to an autonomous vehicle, the methods and apparatus described herein may be implemented in any vehicle capable of autonomous or semi-autonomous operation. Although described with reference to a vehicle transport network, the methods and apparatus described herein may include the autonomous vehicle operating in any area navigable by the vehicle.
[024] As used herein, the terminology "computer" or "computing device" includes any unit, or combination of units, capable of executing any method, or any part or parts thereof, disclosed in this document.
[025] As used herein, the term “processor” indicates one or more processors, such as one or more special-purpose processors, one or more digital signal processors, one or more microprocessors, one or more controllers, one or more microcontrollers, one or more application processors, one or more Application Specific Integrated Circuits, one or more Application Specific Standard Products, one or more Field Programmable Gate Arrays, any other type or combination of integrated circuits, one or more state machines, or any combination thereof.
[026] As used herein, the terminology “memory” indicates any computer-usable or computer-readable medium or device that can tangibly contain, store, communicate, or transport any signal or information that may be used by or in connection with any processor. For example, a memory may be one or more read-only memories (ROM), one or more random access memories (RAM), one or more registers, low power double data rate (LPDDR) memories, one or more cache memories, one or more semiconductor memory devices, one or more magnetic media, one or more optical media, one or more magneto-optical media, or any combination thereof.
[027] As used herein, the terminology “instructions” may include directions or expressions for performing any method, or any portion or portions thereof, disclosed herein, and may be realized in hardware, software, or any combination thereof. For example, instructions may be implemented as information, such as a computer program, stored in memory that may be executed by a processor to perform any of the respective methods, algorithms, aspects, or combinations thereof, as described herein. In some embodiments, instructions, or a portion thereof, may be implemented as a special-purpose processor, or circuitry, that may include specialized hardware for carrying out any of the methods, algorithms, aspects, or combinations thereof, as described herein. In some implementations, portions of the instructions may be distributed across multiple processors on a single device, or on multiple devices, which may communicate directly or across a network, such as a local area network, a wide area network, the Internet, or a combination thereof.
[028] As used herein, the terminology “example”, “embodiment”, “implementation”, “aspect”, “feature”, or “element” indicates serving as an example, instance, or illustration. Unless expressly indicated, any example, embodiment, implementation, aspect, feature, or element is independent of each other example, embodiment, implementation, aspect, feature, or element and may be used in combination with any other example, embodiment, implementation, aspect, feature, or element.
[029] As used herein, the terminology “determine” and “identify”, or any variations thereof, includes selecting, ascertaining, computing, looking up, receiving, determining, establishing, obtaining, or otherwise identifying or determining in any manner whatsoever using one or more of the devices shown and described herein.
[030] As used herein, the terminology “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X includes A or B” is intended to indicate any of the natural inclusive permutations. That is, if X includes A; X includes B; or X includes both A and B, then “X includes A or B” is satisfied under any of the foregoing instances. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.
[031] Further, for simplicity of explanation, although the figures and descriptions herein may include sequences or series of steps or stages, elements of the methods disclosed herein may occur in various orders or concurrently. Additionally, elements of the methods disclosed herein may occur with other elements not explicitly presented and described herein. Furthermore, not all elements of the methods described herein may be required to implement a method in accordance with this disclosure. Although aspects, features, and elements are described herein in particular combinations, each aspect, feature, or element may be used independently or in various combinations with or without other aspects, features, and elements.
[032] Figure 1 is a diagram of an example of a vehicle in which the aspects, features, and elements disclosed herein may be implemented. In some embodiments, a vehicle 1000 may include a chassis 1100, a power train 1200, a controller 1300, wheels 1400, or any other element or combination of elements of a vehicle. Although the vehicle 1000 is shown as including four wheels 1400 for simplicity, any other propulsion device or devices, such as a propeller or tread, may be used. In figure 1, the lines interconnecting elements, such as the power train 1200, the controller 1300, and the wheels 1400, indicate that information, such as data or control signals, power, such as electrical power or torque, or both information and power, may be communicated between the respective elements. For example, the controller 1300 may receive power from the power train 1200 and may communicate with the power train 1200, the wheels 1400, or both, to control the vehicle 1000, which may include accelerating, decelerating, steering, or otherwise controlling the vehicle 1000.
[033] The power train 1200 may include a power source 1210, a transmission 1220, a steering unit 1230, an actuator 1240, or any other element or combination of elements of a power train, such as a suspension, a drive shaft, axles, or an exhaust system. Although shown separately, the wheels 1400 may be included in the power train 1200.
[034] The power source 1210 may include an engine, a battery, or a combination thereof. The power source 1210 may be any device or combination of devices operative to provide energy, such as electrical energy, thermal energy, or kinetic energy. For example, the power source 1210 may include an engine, such as an internal combustion engine, an electric motor, or a combination of an internal combustion engine and an electric motor, and may be operative to provide kinetic energy as a motive force to one or more of the wheels 1400. In some embodiments, the power source 1210 may include a potential energy unit, such as one or more dry cell batteries, such as nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion); solar cells; fuel cells; or any other device capable of providing energy.
[035] The transmission 1220 may receive energy, such as kinetic energy, from the power source 1210, and may transmit the energy to the wheels 1400 to provide a motive force. The transmission 1220 may be controlled by the controller 1300, the actuator 1240, or both. The steering unit 1230 may be controlled by the controller 1300, the actuator 1240, or both, and may control the wheels 1400 to steer the vehicle. The actuator 1240 may receive signals from the controller 1300 and may actuate or control the power source 1210, the transmission 1220, the steering unit 1230, or any combination thereof to operate the vehicle 1000.
[036] In some embodiments, the controller 1300 may include a location unit 1310, an electronic communication unit 1320, a processor 1330, a memory 1340, a user interface 1350, a sensor 1360, an electronic communication interface 1370, or any combination thereof. Although shown as a single unit, any one or more elements of the controller 1300 may be integrated into any number of separate physical units. For example, the user interface 1350 and the processor 1330 may be integrated in a first physical unit and the memory 1340 may be integrated in a second physical unit. Although not shown in figure 1, the controller 1300 may include a power source, such as a battery. Although shown as separate elements, the location unit 1310, the electronic communication unit 1320, the processor 1330, the memory 1340, the user interface 1350, the sensor 1360, the electronic communication interface 1370, or any combination thereof may be integrated in one or more electronic units, circuits, or chips.
[037] In some embodiments, the processor 1330 may include any device or combination of devices, now existing or hereafter developed, capable of manipulating or processing a signal or other information, including optical processors, quantum processors, molecular processors, or a combination thereof. For example, the processor 1330 may include one or more special-purpose processors, one or more digital signal processors, one or more microprocessors, one or more controllers, one or more microcontrollers, one or more integrated circuits, one or more Application Specific Integrated Circuits, one or more Field Programmable Gate Arrays, one or more programmable logic arrays, one or more programmable logic controllers, one or more state machines, or any combination thereof. The processor 1330 may be operatively coupled with the location unit 1310, the memory 1340, the electronic communication interface 1370, the electronic communication unit 1320, the user interface 1350, the sensor 1360, the power train 1200, or any combination thereof. For example, the processor may be operatively coupled with the memory 1340 via a communication bus 1380.
[038] The memory 1340 may include any tangible non-transitory computer-usable or computer-readable medium capable of, for example, containing, storing, communicating, or transporting machine-readable instructions, or any information associated therewith, for use by or in connection with the processor 1330. The memory 1340 may be, for example, one or more solid state drives, one or more memory cards, one or more removable media, one or more read-only memories, one or more random access memories, one or more disks, including a hard disk, a floppy disk, an optical disk, a magnetic or optical card, or any type of non-transitory media suitable for storing electronic information, or any combination thereof.
[039] The communication interface 1370 may be a wireless antenna, as shown, a wired communication port, an optical communication port, or any other wired or wireless unit capable of interfacing with a wired or wireless electronic communication medium 1500. Although figure 1 shows the communication interface 1370 communicating via a single communication link, a communication interface may be configured to communicate via multiple communication links. Although figure 1 shows a single communication interface 1370, a vehicle may include any number of communication interfaces.
[040] The communication unit 1320 may be configured to transmit or receive signals via a wired or wireless electronic communication medium 1500, such as via the communication interface 1370. Although not explicitly shown in figure 1, the communication unit 1320 may be configured to transmit, receive, or both, via any wired or wireless communication medium, such as radio frequency (RF), ultraviolet (UV), visible light, fiber optic, wireline, or a combination thereof. Although figure 1 shows a single communication unit 1320 and a single communication interface 1370, any number of communication units and any number of communication interfaces may be used. In some embodiments, the communication unit 1320 may include a dedicated short range communications (DSRC) unit, an on-board unit (OBU), or a combination thereof.
[041] The location unit 1310 may determine geolocation information, such as longitude, latitude, elevation, direction of travel, or speed, of the vehicle 1000. For example, the location unit may include a global positioning system (GPS) unit, such as a Wide Area Augmentation System (WAAS) enabled National Marine Electronics Association (NMEA) unit, a radio triangulation unit, or a combination thereof. The location unit 1310 can be used to obtain information that represents, for example, a current heading of the vehicle 1000, a current position of the vehicle 1000 in two or three dimensions, a current angular orientation of the vehicle 1000, or a combination thereof.
[042] The user interface 1350 may include any unit capable of interfacing with a person, such as a virtual or physical keypad, a touchpad, a display, a touch display, a heads-up display, a virtual display, an augmented reality display, a haptic display, a feature tracking device, such as an eye-tracking device, a speaker, a microphone, a video camera, a sensor, a printer, or any combination thereof. The user interface 1350 may be operatively coupled with the processor 1330, as shown, or with any other element of the controller 1300. Although shown as a single unit, the user interface 1350 may include one or more physical units. For example, the user interface 1350 may include an audio interface for performing audio communication with a person, and a touch display for performing visual and touch-based communication with the person. In some embodiments, the user interface 1350 may include multiple displays, such as multiple physically separate units, multiple defined portions within a single physical unit, or a combination thereof.
[043] The sensor 1360 may include one or more sensors, such as an array of sensors, which may be operable to provide information that may be used to control the vehicle. The sensors 1360 may provide information regarding current operating characteristics of the vehicle. The sensors 1360 can include, for example, a speed sensor, acceleration sensors, a steering angle sensor, traction-related sensors, braking-related sensors, steering wheel position sensors, eye tracking sensors, seating position sensors, or any sensor, or combination of sensors, that is operable to report information regarding some aspect of the current dynamic situation of the vehicle 1000.
[044] In some embodiments, the sensors 1360 may include sensors that are operable to obtain information regarding the physical environment surrounding the vehicle 1000. For example, one or more sensors may detect road geometry and obstacles, such as fixed obstacles, vehicles, and pedestrians. In some embodiments, the sensors 1360 can be or include one or more video cameras, laser-sensing systems, infrared-sensing systems, acoustic-sensing systems, or any other suitable type of on-vehicle environmental sensing device, or combination of devices, now known or later developed. In some embodiments, the sensors 1360 and the location unit 1310 may be combined.
[045] Although not shown separately, in some embodiments, the vehicle 1000 may include a trajectory controller. For example, the controller 1300 may include the trajectory controller. The trajectory controller may be operable to obtain information describing a current state of the vehicle 1000 and a route planned for the vehicle 1000, and, based on this information, to determine and optimize a trajectory for the vehicle 1000. In some embodiments, the trajectory controller can output signals operable to control the vehicle 1000 such that the vehicle 1000 follows the trajectory that is determined by the trajectory controller. For example, the output of the trajectory controller can be an optimized trajectory that may be supplied to the power train 1200, the wheels 1400, or both. In some embodiments, the optimized trajectory can be control inputs such as a set of steering angles, with each steering angle corresponding to a point in time or a position. In some embodiments, the optimized trajectory can be one or more paths, lines, curves, or a combination thereof.
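As a sketch of the "set of steering angles, each corresponding to a point in time" form of optimized trajectory mentioned above, the helper below interpolates a steering command for an arbitrary time from such a set. The list-of-pairs representation and the linear interpolation are assumptions for illustration; the disclosure does not specify how the trajectory is encoded or sampled.

```python
def steering_at(optimized_trajectory, t):
    """Linearly interpolate a steering angle at time t from an optimized
    trajectory given as a time-sorted list of (time_seconds,
    steering_angle_degrees) pairs; clamp outside the covered interval."""
    if t <= optimized_trajectory[0][0]:
        return optimized_trajectory[0][1]
    if t >= optimized_trajectory[-1][0]:
        return optimized_trajectory[-1][1]
    for (t0, a0), (t1, a1) in zip(optimized_trajectory,
                                  optimized_trajectory[1:]):
        if t0 <= t <= t1:
            return a0 + (a1 - a0) * (t - t0) / (t1 - t0)
```

For example, a trajectory that steers from 0° at t = 0 s to 10° at t = 1 s and back to 0° at t = 2 s yields 5° when sampled at t = 0.5 s.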
[046] One or more of the wheels 1400 may be a steered wheel, which may be pivoted to a steering angle under control of the steering unit 1230, a propelled wheel, which may be torqued to propel the vehicle 1000 under control of the transmission 1220, or a steered and propelled wheel that may steer and propel the vehicle 1000.
[047] Although not shown in figure 1, a vehicle may include units or elements not shown in figure 1, such as an enclosure, a Bluetooth® module, a frequency modulated (FM) radio unit, a Near Field Communication (NFC) module, a liquid crystal display (LCD) display unit, an organic light-emitting diode (OLED) display unit, a speaker, or any combination thereof.
[048] In some embodiments, the vehicle 1000 may be an autonomous vehicle. An autonomous vehicle may be controlled autonomously, without direct human intervention, to traverse a portion of a vehicle transport network. Although not shown separately in figure 1, in some implementations, an autonomous vehicle may include an autonomous vehicle control unit, which may perform autonomous vehicle routing, navigation, and control. In some implementations, the autonomous vehicle control unit may be integrated with another unit of the vehicle. For example, the controller 1300 may include the autonomous vehicle control unit.
[049] In some implementations, the autonomous vehicle control unit may control or operate the vehicle 1000 to traverse a portion of the vehicle transport network in accordance with current vehicle operation parameters. In another example, the autonomous vehicle control unit may control or operate the vehicle 1000 to perform a defined operation or maneuver, such as parking the vehicle. In another example, the autonomous vehicle control unit may generate a route of travel from an origin, such as a current location of the vehicle 1000, to a destination based on vehicle information, environment information, vehicle transport network information representing the vehicle transport network, or a combination thereof, and may control or operate the vehicle 1000 to traverse the vehicle transport network in accordance with the route. For example, the autonomous vehicle control unit may output the route of travel to a trajectory controller that may operate the vehicle 1000 to travel from the origin to the destination using the generated route.
[050] Figure 2 is a diagram of an example of a portion of a vehicle transport and communication system in which the aspects, features, and elements disclosed herein may be implemented. The vehicle transport and communication system 2000 may include one or more vehicles 2100/2110, such as the vehicle 1000 shown in figure 1, which may travel via one or more portions of one or more vehicle transport networks 2200, and may communicate via one or more electronic communication networks 2300. Although not explicitly shown in figure 2, a vehicle may traverse an area that is not expressly or completely included in a vehicle transport network, such as an off-road area.
[051] In some embodiments, the electronic communication network 2300 may be, for example, a multiple access system and may provide for communication, such as voice communication, data communication, video communication, messaging communication, or a combination thereof, between the vehicle 2100/2110 and one or more communication devices 2400. For example, a vehicle 2100/2110 may receive information, such as information representing the vehicle transport network 2200, from a communication device 2400 via the network 2300.
[052] In some embodiments, a vehicle 2100/2110 can communicate via a wired communication link (not shown), a wireless communication link 2310/2320/2370, or a combination of any number of wired or wireless communication links. For example, as shown, a vehicle 2100/2110 can communicate via a terrestrial wireless communication link 2310, via a non-terrestrial wireless communication link 2320, or via a combination thereof. In some implementations, a terrestrial wireless communication link 2310 may include an Ethernet link, a serial link, a Bluetooth link, an infrared (IR) link, an ultraviolet (UV) link, or any link capable of enabling electronic communication.
[053] In some embodiments, a vehicle 2100/2110 can communicate with another vehicle 2100/2110. For example, a host, or subject, vehicle (HV) 2100 may receive one or more automated inter-vehicle messages, such as a basic safety message (BSM), from a remote, or target, vehicle (RV) 2110, via a direct communication link 2370, or via a network 2300. For example, the remote vehicle 2110 can broadcast the message to host vehicles within a defined broadcast range, such as 300 meters. In some embodiments, the host vehicle 2100 may receive a message via a third party, such as a signal repeater (not shown) or another remote vehicle (not shown). In some embodiments, a vehicle 2100/2110 may transmit one or more automated inter-vehicle messages periodically, based on, for example, a defined interval, such as 100 milliseconds.
[054] Automated inter-vehicle messages may include vehicle identification information; geospatial state information, such as longitude, latitude, or elevation information; geospatial location accuracy information; kinematic state information, such as vehicle acceleration information, yaw-rate information, speed information, vehicle heading information, braking system status information, throttle information, steering-wheel angle information, or vehicle route information; or vehicle operating state information, such as vehicle size information, headlight state information, turn-signal information, windshield-wiper state information, transmission information, or any other information, or combination of information, pertinent to the state of the transmitting vehicle. For example, transmission state information can indicate whether the transmission of the transmitting vehicle is in a neutral state, a parked state, a forward state, or a reverse state.
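As a sketch, the message fields enumerated in paragraph [054] can be grouped into a simple record. The field names below are illustrative groupings, not a wire format; the SAE J2735 standard defines the actual BSM encoding.

```python
from dataclasses import dataclass


@dataclass
class BasicSafetyMessage:
    """Illustrative grouping of the BSM-style fields listed in [054]."""
    vehicle_id: str           # vehicle identification
    longitude: float          # geospatial state
    latitude: float
    elevation_m: float
    speed_mps: float          # kinematic state
    heading_deg: float
    yaw_rate_dps: float
    transmission_state: str   # e.g. 'neutral', 'park', 'forward', 'reverse'


def in_broadcast_range(distance_m: float, range_m: float = 300.0) -> bool:
    # A host vehicle within the defined broadcast range receives the message.
    return distance_m <= range_m


msg = BasicSafetyMessage("RV-2110", -122.4, 37.7, 12.0, 8.3, 90.0, 0.1, "forward")
```

A receiving host vehicle could then, for example, check `in_broadcast_range` against its measured distance to the transmitter before treating the message as relevant.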
[055] In some embodiments, the vehicle 2100 can communicate with the communications network 2300 via an access point 2330. An access point 2330, which may include a computing device, can be configured to communicate with a vehicle 2100, with a communication network 2300, with one or more communication devices 2400, or with a combination thereof via wired or wireless communication links 2310/2340. For example, an access point 2330 can be a base station, a base transceiver station (BTS), a Node-B, an enhanced Node-B (eNode-B), a home Node-B (HNode-B), a wireless router, a wired router, a hub, a relay, a switch, or any similar wired or wireless device. Although shown as a single unit, an access point can include any number of interconnected elements.
[056] In some embodiments, the vehicle 2100 can communicate with the communications network 2300 via a satellite 2350, or another non-terrestrial communication device. A satellite 2350, which may include a computing device, can be configured to communicate with a vehicle 2100, with a communication network 2300, with one or more communication devices 2400, or with a combination thereof via one or more communication links 2320/2360. Although shown as a single unit, a satellite can include any number of interconnected elements.
[057] An electronic communication network 2300 can be any type of network configured to allow voice, data, or any other type of electronic communication. For example, the electronic communication network 2300 may include a local area network (LAN), a wide area network (WAN), a virtual private network (VPN), a mobile or cellular telephone network, the Internet, or any other electronic communication system. The electronic communication network 2300 can use a communication protocol, such as the transmission control protocol (TCP), the user datagram protocol (UDP), the internet protocol (IP), the real-time transport protocol (RTP), the hypertext transport protocol (HTTP), or a combination thereof. Although shown as a single unit, an electronic communication network can include any number of interconnected elements.
[058] In some embodiments, a vehicle 2100 may identify a part or condition of the vehicle transport network 2200. For example, the vehicle may include one or more on-vehicle sensors 2105, such as the sensor 1360 shown in figure 1, which may include a speed sensor, a wheel speed sensor, a camera, a gyroscope, an optical sensor, a laser sensor, a radar sensor, a sonic sensor, or any other sensor or device, or combination thereof, capable of determining or identifying a part or condition of the vehicle transport network 2200.
[059] In some embodiments, a vehicle 2100 can traverse a part or parts of one or more vehicle transport networks 2200 using information communicated via the network 2300, such as information representing the vehicle transport network 2200, information identified by one or more on-vehicle sensors 2105, or a combination thereof.
[060] Although, for simplicity, figure 2 shows one vehicle 2100, one vehicle transport network 2200, one electronic communication network 2300, and one communication device 2400, any number of vehicles, networks, or computing devices can be used. In some embodiments, the vehicle transport and communication system 2000 can include devices, units, or elements not shown in figure 2. Although the vehicle 2100 is shown as a single unit, a vehicle can include any number of interconnected elements.
[061] Although the vehicle 2100 is shown communicating with the communication device 2400 via the network 2300, the vehicle 2100 can communicate with the communication device 2400 via any number of direct or indirect communication links. For example, the vehicle 2100 can communicate with the communication device 2400 via a direct communication link, such as a Bluetooth communication link.
[062] In some embodiments, a vehicle 2100/2110 can be associated with an entity 2500/2510, such as a driver, operator, or owner of the vehicle. In some embodiments, an entity 2500/2510 associated with a vehicle 2100/2110 can be associated with one or more personal electronic devices 2502/2504/2512/2514, such as a smartphone 2502/2512 or a computer 2504/2514. In some embodiments, a personal electronic device 2502/2504/2512/2514 can communicate with a corresponding vehicle 2100/2110 via a direct or indirect communication link. Although one entity 2500/2510 is shown as associated with one vehicle 2100/2110 in figure 2, any number of vehicles can be associated with an entity and any number of entities can be associated with a vehicle.
[063] Figure 3 is a diagram of a part of a vehicle transport network according to this disclosure. A vehicle transport network 3000 may include one or more non-navigable areas 3100, such as a building; one or more partially navigable areas, such as the parking area 3200; one or more navigable areas, such as the roads 3300/3400; or a combination thereof. In some embodiments, an autonomous vehicle, such as the vehicle 1000 shown in figure 1, one of the vehicles 2100/2110 shown in figure 2, a semi-autonomous vehicle, or any other vehicle implementing autonomous driving, can traverse a part or parts of the vehicle transport network 3000.
[064] The vehicle transport network may include one or more interconnections 3210 between one or more navigable, or partially navigable, areas 3200/3300/3400. For example, the part of the vehicle transport network shown in figure 3 includes an interconnection 3210 between the parking area 3200 and the road 3400. In some embodiments, the parking area 3200 may include parking spaces 3220.
[065] A part of the vehicle transport network, such as a road 3300/3400, can include one or more lanes 3320/3340/3360/3420/3440 and can be associated with one or more directions of travel, which are indicated by the arrows in figure 3.
[066] In some embodiments, a vehicle transport network, or a part thereof, such as the part of the vehicle transport network shown in figure 3, can be represented as vehicle transport network information. For example, vehicle transport network information can be expressed as a hierarchy of elements, such as markup language elements, which can be stored in a database or file. For simplicity, the figures in this document show vehicle transport network information representing parts of a vehicle transport network as diagrams or maps; however, vehicle transport network information can be expressed in any computer-usable form capable of representing a vehicle transport network, or a part thereof. In some embodiments, vehicle transport network information may include vehicle transport network control information, such as direction-of-travel information, speed limit information, toll information, grade information, such as inclination or angle information, surface material information, aesthetic information, or a combination thereof.
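The idea that vehicle transport network information can be expressed as a hierarchy of markup-language elements can be sketched with the standard library XML tools. The element and attribute names below are invented for illustration; the patent does not specify a schema.

```python
import xml.etree.ElementTree as ET

# Build a tiny, hypothetical transport-network hierarchy: a road with two
# lanes and associated control information (speed limit, direction of travel).
network = ET.Element("vehicleTransportNetwork")
road = ET.SubElement(network, "road", id="3400", speedLimitKph="50")
ET.SubElement(road, "lane", id="3420", directionOfTravel="north")
ET.SubElement(road, "lane", id="3440", directionOfTravel="south")

# Serialize to a string, as it might be stored in a database or file...
document = ET.tostring(network, encoding="unicode")

# ...and read it back, recovering the same hierarchy.
parsed = ET.fromstring(document)
lane_ids = [lane.get("id") for lane in parsed.iter("lane")]
```

The same hierarchy could equally be expressed in JSON or a relational schema; the markup form simply makes the element nesting explicit, as the text suggests.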
[067] In some embodiments, a part, or a combination of parts, of the vehicle transport network can be identified as a point of interest or a destination. For example, the vehicle transport network information can identify a building, such as the non-navigable area 3100, and the adjacent partially navigable parking area 3200 as a point of interest, an autonomous vehicle can identify the point of interest as a destination, and the autonomous vehicle can travel from an origin to the destination by traversing the vehicle transport network. Although the parking area 3200 associated with the non-navigable area 3100 is shown as adjacent to the non-navigable area 3100 in figure 3, a destination may include, for example, a building and a parking area that is physically or geospatially non-adjacent to the building.
[068] In some embodiments, identifying a destination may include identifying a location for the destination, which may be a discrete, uniquely identifiable geolocation. For example, the vehicle transport network may include a defined location, such as a street address, a postal address, a vehicle transport network address, a GPS address, or a combination thereof, for the destination.
[069] In some embodiments, a destination can be associated with one or more entrances, such as the entrance 3500 shown in figure 3. In some embodiments, the vehicle transport network information may include defined entrance location information, such as information identifying a geolocation of an entrance associated with a destination. In some embodiments, predicted entrance location information can be determined as described in this document.
[070] In some embodiments, the vehicle transport network may be associated with, or may include, a pedestrian transport network. For example, figure 3 includes a part 3600 of a pedestrian transport network, which can be a pedestrian walkway. In some embodiments, a pedestrian transport network, or a part thereof, such as the part 3600 of the pedestrian transport network shown in figure 3, can be represented as pedestrian transport network information. In some embodiments, the vehicle transport network information may include pedestrian transport network information. A pedestrian transport network can include pedestrian navigable areas. A pedestrian navigable area, such as a pedestrian walkway or a sidewalk, may correspond with a non-navigable area of a vehicle transport network. Although not shown separately in figure 3, a pedestrian navigable area, such as a pedestrian crosswalk, may correspond with a navigable area, or a partially navigable area, of a vehicle transport network.
[071] In some embodiments, a destination can be associated with one or more boarding and disembarking locations, such as the boarding and disembarking location 3700 shown in figure 3. A boarding and disembarking location 3700 may be a designated or undesignated location or area, near a destination, where an autonomous vehicle can stop, stand, or park such that boarding and disembarking operations, such as picking up or dropping off a passenger, can be performed.
[072] In some embodiments, the vehicle transport network information may include boarding and disembarking location information, such as information identifying a geolocation of one or more boarding and disembarking locations 3700 associated with a destination. In some embodiments, the boarding and disembarking location information may be defined boarding and disembarking location information, which may be boarding and disembarking location information included manually in the vehicle transport network information. For example, defined boarding and disembarking location information may be included in the vehicle transport network information based on user input. In some embodiments, the boarding and disembarking location information may be automatically generated boarding and disembarking location information, as described in this document. Although not shown separately in figure 3, boarding and disembarking location information may identify a type of boarding and disembarking operation associated with a boarding and disembarking location 3700. For example, a destination may be associated with a first boarding and disembarking location for picking up a passenger and a second boarding and disembarking location for dropping off a passenger. Although an autonomous vehicle can park at a boarding and disembarking location, a boarding and disembarking location associated with a destination can be independent of, and distinct from, a parking area associated with the destination.
[073] In an example, an autonomous vehicle can identify a point of interest, which can include the non-navigable area 3100, the parking area 3200, and the entrance 3500, as a destination. The autonomous vehicle can identify the non-navigable area 3100, or the entrance 3500, as a primary destination for the point of interest, and can identify the parking area 3200 as a secondary destination. The autonomous vehicle can identify the boarding and disembarking location 3700 as a boarding and disembarking location for the primary destination. The autonomous vehicle can generate a route from an origin (not shown) to the boarding and disembarking location 3700. The autonomous vehicle can traverse the vehicle transport network from the origin to the boarding and disembarking location 3700 using the route. The autonomous vehicle can stop or park at the boarding and disembarking location 3700 such that the passenger pick-up or drop-off operation can be performed. The autonomous vehicle can generate a subsequent route from the boarding and disembarking location 3700 to the parking area 3200, can traverse the vehicle transport network from the boarding and disembarking location 3700 to the parking area 3200 using the subsequent route, and can park in the parking area 3200.
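The two-leg trip in paragraph [073] (origin to a boarding and disembarking location, then a subsequent route to a parking area) can be sketched as successive shortest-path queries over a small graph. The graph, edge weights, and node names are invented for illustration; the patent does not prescribe a routing algorithm, so plain Dijkstra is used here only as a stand-in.

```python
import heapq


def shortest_path(graph, origin, destination):
    """Plain Dijkstra over an adjacency dict {node: [(neighbor, cost), ...]}."""
    queue = [(0.0, origin, [origin])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == destination:
            return path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in graph.get(node, []):
            if neighbor not in seen:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return None


# Hypothetical fragment of a vehicle transport network.
graph = {
    "origin": [("intersection", 1.0)],
    "intersection": [("pickup_3700", 2.0), ("parking_3200", 5.0)],
    "pickup_3700": [("parking_3200", 1.5)],
}

leg1 = shortest_path(graph, "origin", "pickup_3700")        # travel to pick-up
leg2 = shortest_path(graph, "pickup_3700", "parking_3200")  # subsequent route
```

After traversing `leg1` and completing the pick-up or drop-off operation, the vehicle would plan and traverse `leg2` to the parking area, matching the sequence in the paragraph above.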
[074] Figure 4 is a diagram of an example of an autonomous vehicle operational management system 4000 according to embodiments of this disclosure. The autonomous vehicle operational management system 4000 can be implemented in an autonomous vehicle, such as the vehicle 1000 shown in figure 1, one of the vehicles 2100/2110 shown in figure 2, a semi-autonomous vehicle, or any other vehicle implementing autonomous driving.
[075] An autonomous vehicle may traverse a vehicle transport network, or a part thereof, which may include traversing distinct vehicle operating scenarios. A distinct vehicle operating scenario may include any distinctly identifiable set of operating conditions that may affect the operation of the autonomous vehicle within a defined space-time area, or operating environment, of the autonomous vehicle. For example, a distinct vehicle operating scenario may be based on a number or cardinality of roads, road segments, or lanes that the autonomous vehicle may traverse within a defined space-time distance. In another example, a distinct vehicle operating scenario may be based on one or more traffic control devices that may affect the operation of the autonomous vehicle within a defined space-time area, or operating environment, of the autonomous vehicle. In another example, a distinct vehicle operating scenario may be based on one or more identifiable rules, regulations, or laws that may affect the operation of the autonomous vehicle within a defined space-time area, or operating environment, of the autonomous vehicle. In another example, a distinct vehicle operating scenario may be based on one or more identifiable external objects that may affect the operation of the autonomous vehicle within a defined space-time area, or operating environment, of the autonomous vehicle.
[076] Examples of distinct vehicle operating scenarios include a distinct vehicle operating scenario in which the autonomous vehicle is traversing an intersection, a distinct vehicle operating scenario in which a pedestrian is crossing, or approaching, the expected path of the autonomous vehicle, and a distinct vehicle operating scenario in which the autonomous vehicle is changing lanes.
[077] For simplicity and clarity, similar vehicle operating scenarios may be described in this document with reference to vehicle operating scenario types or classes. For example, vehicle operating scenarios including pedestrians may be referred to in this document as pedestrian scenarios, referring to the types or classes of vehicle operating scenarios that include pedestrians. As an example, a first vehicle operating scenario including a pedestrian may include a pedestrian crossing a road at a crosswalk, and a second vehicle operating scenario including a pedestrian may include a pedestrian crossing a road by jaywalking. Although vehicle operating scenarios including pedestrians, vehicle operating scenarios including intersections, and vehicle operating scenarios including lane changes are described in this document, any other vehicle operating scenario or vehicle operating scenario type may be used.
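The scenario types or classes described in paragraph [077] can be sketched as an enumeration plus a trivial classifier over detected operating environment features. The feature flags and the mapping below are invented for illustration; the patent does not specify how scenarios are detected.

```python
from enum import Enum, auto


class ScenarioType(Enum):
    PEDESTRIAN = auto()
    INTERSECTION = auto()
    LANE_CHANGE = auto()


def classify(features: dict) -> set:
    """Map detected operating-environment features to scenario types.

    `features` is a hypothetical dict of boolean detections, e.g. produced
    by operating environment monitors.
    """
    scenarios = set()
    if features.get("pedestrian_near_expected_path"):
        scenarios.add(ScenarioType.PEDESTRIAN)
    if features.get("approaching_intersection"):
        scenarios.add(ScenarioType.INTERSECTION)
    if features.get("lane_change_identified"):
        scenarios.add(ScenarioType.LANE_CHANGE)
    return scenarios


# A pedestrian approaching while the vehicle traverses an intersection yields
# two concurrent scenario types (compare the composite scenario in [080]).
active = classify({"pedestrian_near_expected_path": True,
                   "approaching_intersection": True})
```

Grouping scenarios into classes this way is what lets one model, such as a pedestrian-scenario model, cover both the crosswalk and the jaywalking variants.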
[078] Aspects of the operating environment of the autonomous vehicle may be represented within respective distinct vehicle operating scenarios. For example, the relative orientation, trajectory, and expected path of external objects may be represented within respective distinct vehicle operating scenarios. In another example, the relative geometry of the vehicle transport network may be represented within respective distinct vehicle operating scenarios.
[079] As an example, a first distinct vehicle operating scenario may correspond to a pedestrian crossing a road at a crosswalk, and a relative orientation and expected path of the pedestrian, such as crossing from left to right or crossing from right to left, can be represented within the first distinct vehicle operating scenario. A second distinct vehicle operating scenario may correspond to a pedestrian crossing a road by jaywalking, and a relative orientation and expected path of the pedestrian, such as crossing from left to right or crossing from right to left, can be represented within the second distinct vehicle operating scenario.
[080] In some embodiments, an autonomous vehicle may traverse multiple distinct vehicle operating scenarios within an operating environment, which may be aspects of a composite vehicle operating scenario. For example, a pedestrian may approach the expected path of the autonomous vehicle as it traverses an intersection.
[081] The autonomous vehicle operational management system 4000 can operate or control the autonomous vehicle to traverse distinct vehicle operating scenarios subject to defined restrictions, such as safety restrictions, legal restrictions, physical restrictions, user acceptability restrictions, or any other restriction or combination of restrictions that can be defined or derived for the operation of the autonomous vehicle.
[082] In some embodiments, controlling the autonomous vehicle to traverse the distinct vehicle operating scenarios may include identifying or detecting the distinct vehicle operating scenarios, identifying candidate vehicle control actions based on the distinct vehicle operating scenarios, controlling the autonomous vehicle to traverse a part of the vehicle transport network in accordance with one or more of the candidate vehicle control actions, or a combination thereof.
[083] A vehicle control action may indicate a vehicle control operation or maneuver, such as accelerating, decelerating, turning, or stopping, or any other vehicle operation or combination of vehicle operations that may be performed by the autonomous vehicle in conjunction with traversing a part of the vehicle transport network.
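The vehicle control actions introduced here, and the named actions elaborated in the following paragraphs ('stop', 'forward', 'accelerate', 'decelerate', 'keep', 'proceed'), can be sketched as an enumeration. The dispatch table is an illustrative placeholder, not the patent's control logic.

```python
from enum import Enum


class VehicleControlAction(Enum):
    STOP = "stop"
    FORWARD = "forward"
    ACCELERATE = "accelerate"
    DECELERATE = "decelerate"
    KEEP = "keep"
    PROCEED = "proceed"


def describe(action: VehicleControlAction) -> str:
    """Illustrative one-line summary of what each action asks the vehicle to do."""
    return {
        VehicleControlAction.STOP: "become or remain stationary",
        VehicleControlAction.FORWARD: "move slowly forward a short distance",
        VehicleControlAction.ACCELERATE: "accelerate at a defined rate",
        VehicleControlAction.DECELERATE: "decelerate at a defined rate",
        VehicleControlAction.KEEP: "maintain current operating parameters",
        VehicleControlAction.PROCEED: "begin or resume identified parameters",
    }[action]
```

Representing actions as an enumeration gives downstream components, such as the operational management controller, a closed, comparable set of candidate actions to evaluate.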
[084] The autonomous vehicle operational management controller 4100, or another unit of the autonomous vehicle, can control the autonomous vehicle to traverse the vehicle transport network, or a part thereof, in accordance with a vehicle control action.
[085] For example, the autonomous vehicle operational management controller 4100 can control the autonomous vehicle to traverse the vehicle transport network, or a part thereof, in accordance with a 'stop' vehicle control action by stopping the autonomous vehicle or otherwise controlling the autonomous vehicle to become or remain stationary.
[086] In another example, the autonomous vehicle operational management controller 4100 can control the autonomous vehicle to traverse the vehicle transport network, or a part thereof, in accordance with a 'forward' vehicle control action by slowly moving forward a short distance, such as a few inches or a foot (30.48 centimeters).
[087] In another example, the autonomous vehicle operational management controller 4100 can control the autonomous vehicle to traverse the vehicle transport network, or a part thereof, in accordance with an 'accelerate' vehicle control action by accelerating at a defined acceleration rate, or at an acceleration rate within a defined range.
[088] In another example, the autonomous vehicle operational management controller 4100 can control the autonomous vehicle to traverse the vehicle transport network, or a part thereof, in accordance with a 'decelerate' vehicle control action by decelerating at a defined deceleration rate, or at a deceleration rate within a defined range.
[089] In another example, the autonomous vehicle operational management controller 4100 can control the autonomous vehicle to traverse the vehicle transport network, or a part thereof, in accordance with a 'keep' vehicle control action by controlling the autonomous vehicle to traverse the vehicle transport network, or a part thereof, in accordance with current operating parameters, such as by maintaining a current speed, maintaining a current path or route, maintaining a current lane orientation, or the like.
[090] In another example, the autonomous vehicle operational management controller 4100 can control the autonomous vehicle to traverse the vehicle transport network, or a part thereof, in accordance with a 'proceed' vehicle control action by controlling the autonomous vehicle to traverse the vehicle transport network, or a part thereof, by beginning or resuming a previously identified set of operating parameters, which may include controlling the autonomous vehicle to traverse the vehicle transport network, or a part thereof, in accordance with one or more other vehicle control actions. For example, the autonomous vehicle may be stationary at an intersection, an identified route for the autonomous vehicle may include traversing the intersection, and controlling the autonomous vehicle in accordance with a 'proceed' vehicle control action may include controlling the autonomous vehicle to accelerate at a defined acceleration rate to a defined speed along the identified path. In another example, the autonomous vehicle may be traversing a part of the vehicle transport network at a defined speed, a lane change may be identified for the autonomous vehicle, and controlling the autonomous vehicle in accordance with a 'proceed' vehicle control action may include controlling the autonomous vehicle to perform a sequence of trajectory adjustments in accordance with defined lane-change parameters such that the autonomous vehicle performs the identified lane-change operation.
[091] In some embodiments, a vehicle control action may include one or more performance metrics. For example, a 'stop' vehicle control action may include a deceleration rate as a performance metric. In another example, a 'proceed' vehicle control action may expressly indicate route or path information, speed information, an acceleration rate, or a combination thereof as performance metrics, or may expressly or implicitly indicate that a current or previously identified path, speed, acceleration rate, or a combination thereof may be maintained.
[092] In some embodiments, a vehicle control action may be a compound vehicle control action, which may include a sequence, a combination, or both of vehicle control actions. For example, a 'forward' vehicle control action may indicate a 'stop' vehicle control action, a subsequent 'accelerate' vehicle control action associated with a defined acceleration rate, and a subsequent 'stop' vehicle control action associated with a defined deceleration rate, such that controlling the autonomous vehicle in accordance with the 'forward' vehicle control action includes controlling the autonomous vehicle to slowly advance a short distance, such as a few inches or a foot (30.48 centimeters).
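The compound 'forward' action just described, a 'stop', then an 'accelerate' at a defined rate, then another 'stop' at a defined deceleration rate, can be sketched as an ordered sequence of primitive actions with attached performance metrics. The representation and the numeric rates are illustrative; the patent does not define a data format.

```python
from dataclasses import dataclass, field


@dataclass
class PrimitiveAction:
    name: str
    metrics: dict = field(default_factory=dict)  # performance metrics, e.g. rates


@dataclass
class CompoundAction:
    name: str
    sequence: list  # ordered PrimitiveAction steps


# 'forward': stop, accelerate at a defined rate, then stop at a defined
# deceleration rate (compare paragraph [092]).
forward = CompoundAction("forward", [
    PrimitiveAction("stop"),
    PrimitiveAction("accelerate", {"acceleration_rate_mps2": 0.5}),
    PrimitiveAction("stop", {"deceleration_rate_mps2": 0.5}),
])

order = [step.name for step in forward.sequence]
```

An executor could walk `forward.sequence` in order, handing each primitive step and its metrics to the same machinery that handles standalone vehicle control actions.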
[093] In some embodiments, the autonomous vehicle operational management system 4000 may include an autonomous vehicle operational management controller 4100, a blocking monitor 4200, operating environment monitors 4300, scenario-specific operational control evaluation modules 4400, or a combination thereof. Although described separately, the blocking monitor 4200 may be an instance, or instances, of an operating environment monitor 4300.
[094] The autonomous vehicle operational management controller 4100 may receive, identify, or otherwise access operating environment information representing an operating environment for the autonomous vehicle, such as a current operating environment or an expected operating environment, or one or more aspects thereof. The operating environment of the autonomous vehicle may include a distinctly identifiable set of operating conditions that may affect the operation of the autonomous vehicle within a defined space-time area of the autonomous vehicle.
[095] For example, operating environment information may include vehicle information for the autonomous vehicle, such as information indicating a geospatial location of the autonomous vehicle, information correlating the geospatial location of the autonomous vehicle with information representing the vehicle transport network, an autonomous vehicle route, an autonomous vehicle speed, an autonomous vehicle acceleration state, autonomous vehicle passenger information, or any other information regarding the autonomous vehicle or the operation of the autonomous vehicle.
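The vehicle information listed in paragraph [095] can be sketched as one record nested inside an operating environment container. All field names are invented for illustration; the patent does not define a data structure.

```python
from dataclasses import dataclass, field


@dataclass
class VehicleInfo:
    """Illustrative vehicle state fields drawn from paragraph [095]."""
    latitude: float
    longitude: float
    speed_mps: float
    acceleration_mps2: float
    route: list = field(default_factory=list)


@dataclass
class OperatingEnvironment:
    """Container a controller might receive, identify, or otherwise access."""
    vehicle: VehicleInfo
    external_objects: list = field(default_factory=list)


env = OperatingEnvironment(
    vehicle=VehicleInfo(37.7, -122.4, 10.0, 0.0, route=["A", "B"]),
    external_objects=["pedestrian", "remote_vehicle"],
)
```

The `external_objects` list is a placeholder for the richer external-object information described in the next paragraphs.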
[096] In another example, operating environment information may include information representing the vehicle transport network proximate to the autonomous vehicle, such as within a defined spatial distance of the autonomous vehicle, such as 300 meters, information indicating the geometry of one or more aspects of the vehicle transport network, information indicating a condition, such as a surface condition, of the vehicle transport network, or any combination thereof.
[097] In another example, operating environment information may include information representing external objects within the operating environment of the autonomous vehicle, such as information representing pedestrians, animals, non-motorized transport devices, such as bicycles or skateboards, motorized transport devices, such as remote vehicles, or any other external object or entity that may affect the operation of the autonomous vehicle.
[098] In some embodiments, the autonomous vehicle operational management controller 4100 can monitor the operating environment of the autonomous vehicle, or defined aspects thereof. In some embodiments, monitoring the operating environment of the autonomous vehicle may include identifying and tracking external objects, identifying distinct vehicle operating scenarios, or a combination thereof.
[099] For example, the autonomous vehicle operational management controller 4100 can identify and track external objects within the operating environment of the autonomous vehicle. Identifying and tracking the external objects may include identifying spatio-temporal locations of respective external objects, which may be relative to the autonomous vehicle, and identifying one or more expected paths for respective external objects, which may include identifying a speed, a trajectory, or both, for an external object. For simplicity and clarity, descriptions of locations, expected locations, paths, expected paths, and the like in this document may omit express indications that the corresponding locations and paths refer to geospatial and temporal components; however, unless expressly indicated in this document, or otherwise unambiguously clear from context, the locations, expected locations, paths, expected paths, and the like described in this document may include geospatial components, temporal components, or both.
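Identifying an expected path from a tracked object's spatio-temporal locations, as in paragraph [099], can be sketched with a constant-velocity extrapolation, one of the simplest possible predictors. The patent does not commit to this model; it is used here purely as an illustration.

```python
def expected_path(observations, horizon_steps, dt=1.0):
    """Extrapolate future (x, y) positions from the last two observations.

    `observations` is a list of (t, x, y) samples; the velocity estimated
    from the final pair is held constant over the prediction horizon.
    """
    (t0, x0, y0), (t1, x1, y1) = observations[-2], observations[-1]
    vx = (x1 - x0) / (t1 - t0)
    vy = (y1 - y0) / (t1 - t0)
    return [(x1 + vx * dt * k, y1 + vy * dt * k)
            for k in range(1, horizon_steps + 1)]


# A pedestrian observed moving 1 m/s along x: predict the next 3 positions.
path = expected_path([(0.0, 0.0, 5.0), (1.0, 1.0, 5.0)], horizon_steps=3)
```

Each predicted point pairs a geospatial component with an implicit temporal one (`k * dt` seconds ahead), matching the text's note that paths carry both components.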
[0100] In some embodiments, the operating environment monitors 4300 may include an operating environment monitor 4310 for monitoring pedestrians (pedestrian monitor), an operating environment monitor 4320 for monitoring intersections (intersection monitor), an operating environment monitor 4330 for monitoring lane changes (lane-change monitor), or a combination thereof. An operating environment monitor 4340 is shown using broken lines to indicate that the autonomous vehicle operational management system 4000 can include any number of operating environment monitors 4300.
[0101] One or more distinct vehicle operating scenarios may be monitored by a respective operating environment monitor 4300. For example, the pedestrian monitor 4310 may monitor operating environment information corresponding to multiple vehicle operating scenarios including pedestrians, the intersection monitor 4320 may monitor operating environment information corresponding to multiple vehicle operating scenarios including intersections, and the lane-change monitor 4330 may monitor operating environment information corresponding to multiple vehicle operating scenarios including lane changes.
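The mapping in paragraph [0101] from scenario classes to their respective monitors can be sketched as a small registry. The monitor class names mirror the reference numerals above, but the dispatch mechanism itself is invented for illustration.

```python
class OperatingEnvironmentMonitor:
    """Base class: a monitor filters operating environment information."""
    scenario_class = None

    def relevant(self, info: dict) -> bool:
        return self.scenario_class in info.get("detected_classes", ())


class PedestrianMonitor(OperatingEnvironmentMonitor):      # cf. 4310
    scenario_class = "pedestrian"


class IntersectionMonitor(OperatingEnvironmentMonitor):    # cf. 4320
    scenario_class = "intersection"


class LaneChangeMonitor(OperatingEnvironmentMonitor):      # cf. 4330
    scenario_class = "lane_change"


monitors = [PedestrianMonitor(), IntersectionMonitor(), LaneChangeMonitor()]


def route_info(info: dict):
    """Return the scenario classes whose monitors consider `info` relevant."""
    return [m.scenario_class for m in monitors if m.relevant(info)]


active = route_info({"detected_classes": ("pedestrian", "intersection")})
```

Extending the list with another monitor subclass corresponds to the open-ended monitor 4340 drawn in broken lines in figure 4.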
[0102] An operating environment monitor 4300 may receive, or otherwise access, operating environment information, such as operating environment information generated or captured by one or more sensors of the autonomous vehicle, vehicle transport network information, vehicle transport network geometry information, or a combination thereof. For example, the operating environment monitor 4310 for monitoring pedestrians may receive, or otherwise access, information, such as sensor information, which may indicate, correspond to, or otherwise be associated with one or more pedestrians in the operating environment of the autonomous vehicle.
[0103] In some embodiments, a 4300 operating environment monitor can associate the operating environment information, or a part thereof, with the operating environment, or with an aspect thereof, such as with an external object, such as a pedestrian, a remote vehicle, or an aspect of vehicle transport network geometry.
[0104] In some embodiments, a 4300 operating environment monitor may generate, or otherwise identify, information representing one or more aspects of the operating environment, such as an external object, such as a pedestrian, a remote vehicle, or an aspect of vehicle transport network geometry, which may include filtering, abstracting or otherwise processing the operating environment information.
[0105] In some embodiments, a 4300 operating environment monitor can output information representing one or more aspects of the operating environment to, or for access by, the 4100 autonomous vehicle operational management controller, such as by storing the information representing the one or more aspects of the operating environment in a memory of the autonomous vehicle, such as the memory 1340 shown in figure 1, accessible by the 4100 autonomous vehicle operational management controller, by sending the information representing the one or more aspects of the operating environment to the 4100 autonomous vehicle operational management controller, or a combination thereof. In some embodiments, a 4300 operating environment monitor can send information representing one or more aspects of the operating environment to one or more elements of the autonomous vehicle operational management system 4000, such as the 4200 blocking monitor.
[0106] For example, the 4310 operating environment monitor for monitoring pedestrians can correlate, associate or otherwise process operating environment information to identify, track or predict actions of one or more pedestrians. For example, the 4310 pedestrian monitor can receive information, such as sensor information, from one or more sensors, which can correspond to one or more pedestrians; the 4310 pedestrian monitor can associate the sensor information with one or more identified pedestrians, which may include identifying a direction of travel, a path, such as an expected path, a current or expected speed, a current or expected acceleration rate, or a combination thereof for one or more of the respective identified pedestrians; and the 4310 pedestrian monitor can output the identified, associated or generated pedestrian information to, or for access by, the 4100 autonomous vehicle operational management controller.
[0107] In another example, the 4320 operating environment monitor for monitoring intersections can correlate, associate or otherwise process operating environment information to identify, track or predict actions of one or more remote vehicles in the operating environment of the autonomous vehicle, to identify an intersection, or an aspect thereof, in the operating environment of the autonomous vehicle, to identify vehicle transport network geometry, or a combination thereof. For example, the 4320 intersection monitor can receive information, such as sensor information, from one or more sensors, which can correspond to one or more remote vehicles in the operating environment of the autonomous vehicle, to the intersection, or one or more aspects thereof, in the operating environment of the autonomous vehicle, to the geometry of the vehicle transport network, or a combination thereof; the 4320 intersection monitor can associate the sensor information with one or more remote vehicles identified in the operating environment of the autonomous vehicle, with the intersection, or one or more aspects thereof, in the operating environment of the autonomous vehicle, with the geometry of the vehicle transport network, or a combination thereof, which may include identifying a current or expected direction of travel, a path, such as an expected path, a current or expected speed, a current or expected acceleration rate, or a combination thereof for one or more of the respective identified remote vehicles; and the 4320 intersection monitor can output the identified, associated or generated intersection information to, or for access by, the 4100 autonomous vehicle operational management controller.
[0108] In another example, the 4330 operating environment monitor for monitoring lane changes can correlate, associate or otherwise process operating environment information to identify, track or predict actions of one or more remote vehicles in the operating environment of the autonomous vehicle, such as information indicating a slow or stationary remote vehicle along the expected path of the autonomous vehicle, to identify one or more aspects of the operating environment of the autonomous vehicle, such as vehicle transport network geometry in the operating environment of the autonomous vehicle, or a combination thereof corresponding geospatially to a current or expected lane change operation. For example, the 4330 lane change monitor can receive information, such as sensor information, from one or more sensors, which can correspond to one or more remote vehicles in the operating environment of the autonomous vehicle, to one or more aspects of the operating environment of the autonomous vehicle, or to a combination thereof corresponding geospatially to a current or expected lane change operation; the 4330 lane change monitor can associate the sensor information with one or more remote vehicles identified in the operating environment of the autonomous vehicle, with one or more aspects of the operating environment of the autonomous vehicle, or with a combination thereof corresponding geospatially to a current or expected lane change operation, which may include identifying a current or expected direction of travel, a path, such as an expected path, a current or expected speed, a current or expected acceleration rate, or a combination thereof for one or more of the respective identified remote vehicles; and the 4330 lane change monitor can output the identified, associated or generated lane change information to, or for access by, the 4100 autonomous vehicle operational management controller.
[0109] The 4100 autonomous vehicle operational management controller can identify one or more distinct vehicle operating scenarios based on one or more aspects of the operating environment represented by the operating environment information. For example, the 4100 autonomous vehicle operational management controller can identify a distinct vehicle operating scenario in response to identifying, or based on, the operating environment information indicated by one or more of the 4300 operating environment monitors.
[0110] In some embodiments, the 4100 autonomous vehicle operational management controller can identify multiple distinct vehicle operating scenarios based on one or more aspects of the operating environment represented by the operating environment information. For example, the operating environment information can include information representing a pedestrian approaching an intersection along an expected path for the autonomous vehicle, and the 4100 autonomous vehicle operational management controller can identify a vehicle operating scenario including the pedestrian, a vehicle operating scenario including the intersection, or both.
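A minimal sketch of identifying distinct operating scenarios from monitored operating environment information might look like the following. The keys and scenario labels are hypothetical, not from the patent.

```python
def identify_scenarios(env_info: dict) -> list:
    """Map operating environment information to distinct vehicle
    operating scenario identifiers (a simplified illustration)."""
    scenarios = []
    for pedestrian in env_info.get("pedestrians", []):
        scenarios.append(("pedestrian", pedestrian))
    for intersection in env_info.get("intersections", []):
        scenarios.append(("intersection", intersection))
    if env_info.get("lane_change_expected"):
        scenarios.append(("lane_change", None))
    return scenarios

# A pedestrian approaching an intersection along the expected path
# yields both a pedestrian scenario and an intersection scenario.
env = {"pedestrians": ["p1"], "intersections": ["i1"]}
print(identify_scenarios(env))  # [('pedestrian', 'p1'), ('intersection', 'i1')]
```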
[0111] The 4100 autonomous vehicle operational management controller can instantiate respective instances of one or more of the 4400 scenario-specific operational control assessment modules based on one or more aspects of the operating environment represented by the operating environment information. For example, the 4100 autonomous vehicle operational management controller can instantiate the 4400 scenario-specific operational control evaluation module instance in response to identifying the distinct vehicle operating scenario.
[0112] In some embodiments, the 4100 autonomous vehicle operational management controller can instantiate multiple instances of one or more 4400 scenario-specific operational control evaluation modules based on one or more aspects of the operating environment represented by the operating environment information. For example, the operating environment information can indicate two pedestrians in the operating environment of the autonomous vehicle, and the 4100 autonomous vehicle operational management controller can instantiate a respective instance of the 4410 pedestrian scenario-specific operational control evaluation module for each pedestrian based on one or more aspects of the operating environment represented by the operating environment information.
[0113] In some embodiments, the cardinality, number or count of identified external objects, such as pedestrians or remote vehicles, corresponding to a scenario, such as the pedestrian scenario, the intersection scenario or the lane change scenario, may exceed a defined threshold, which may be a defined scenario-specific threshold, and the 4100 autonomous vehicle operational management controller may omit instantiating an instance of a 4400 scenario-specific operational control evaluation module corresponding to one or more of the identified external objects.
[0114] For example, the operating environment information indicated by the 4300 operating environment monitors can indicate twenty-five pedestrians in the operating environment of the autonomous vehicle, the defined threshold for the pedestrian scenario can be a defined cardinality, such as ten pedestrians, the 4100 autonomous vehicle operational management controller can identify the ten most relevant pedestrians, such as the ten pedestrians geospatially closest to the autonomous vehicle having expected paths converging with the autonomous vehicle, the 4100 autonomous vehicle operational management controller can instantiate ten instances of the 4410 pedestrian scenario-specific operational control evaluation module for the ten most relevant pedestrians, and the 4100 autonomous vehicle operational management controller may omit instantiating instances of the 4410 pedestrian scenario-specific operational control evaluation module for the other fifteen pedestrians.
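The top-N relevance selection described above can be sketched as follows. This is a simplified illustration assuming geospatial distance as the only relevance metric (the patent also considers converging expected paths); the function and field names are hypothetical.

```python
import math

def most_relevant(pedestrians, ego_position, threshold=10):
    """Keep at most `threshold` pedestrians, ranked by geospatial
    distance to the autonomous vehicle; instances of the pedestrian
    scenario-specific module would be instantiated only for these."""
    ranked = sorted(
        pedestrians,
        key=lambda p: math.dist(p["position"], ego_position),
    )
    return ranked[:threshold]

# Twenty-five pedestrians, threshold of ten:
peds = [{"id": i, "position": (float(i), 0.0)} for i in range(25)]
selected = most_relevant(peds, ego_position=(0.0, 0.0), threshold=10)
print(len(selected))  # 10
```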
[0115] In another example, the operating environment information indicated by the 4300 operating environment monitors may indicate an intersection including four road segments, such as a northbound road segment, a southbound road segment, an eastbound road segment and a westbound road segment, and indicate five remote vehicles corresponding to the northbound road segment, three remote vehicles corresponding to the southbound road segment, four remote vehicles corresponding to the eastbound road segment and two remote vehicles corresponding to the westbound road segment. The defined threshold for the intersection scenario can be a defined cardinality, such as two remote vehicles per road segment. The 4100 autonomous vehicle operational management controller can identify the two most relevant remote vehicles per road segment, such as the two remote vehicles per road segment geospatially closest to the intersection having expected paths converging with the autonomous vehicle; the 4100 autonomous vehicle operational management controller can instantiate two instances of the 4420 intersection scenario-specific operational control evaluation module for the two most relevant remote vehicles corresponding to the northbound road segment, two instances for the two most relevant remote vehicles corresponding to the southbound road segment, two instances for the two most relevant remote vehicles corresponding to the eastbound road segment, and two instances for the two remote vehicles corresponding to the westbound road segment; and the 4100 autonomous vehicle operational management controller may omit instantiating instances of the 4420 intersection scenario-specific operational control evaluation module for the other three remote vehicles corresponding to the northbound road segment, for the other remote vehicle corresponding to the southbound road segment, and for the other two remote vehicles corresponding to the eastbound road segment.
Alternatively, or in addition, the defined threshold for the intersection scenario can be a defined cardinality, such as eight remote vehicles for the intersection, and the 4100 autonomous vehicle operational management controller can identify the eight most relevant remote vehicles for the intersection, such as the eight remote vehicles geospatially closest to the intersection having expected paths converging with the autonomous vehicle; the 4100 autonomous vehicle operational management controller can instantiate eight instances of the 4420 intersection scenario-specific operational control evaluation module for the eight most relevant remote vehicles, and may omit instantiating instances of the 4420 intersection scenario-specific operational control evaluation module for the other six remote vehicles.
[0116] In some embodiments, the 4100 autonomous vehicle operational management controller can send the operating environment information, or one or more aspects thereof, to another unit of the autonomous vehicle, such as the 4200 blocking monitor or one or more instances of the 4400 scenario-specific operational control evaluation modules.
[0117] In some embodiments, the 4100 autonomous vehicle operational management controller can store the operating environment information, or one or more aspects thereof, in a memory of the autonomous vehicle, such as the memory 1340 shown in figure 1.
[0118] The 4100 autonomous vehicle operational management controller can receive candidate vehicle control actions from respective instances of the 4400 scenario-specific operational control evaluation modules. For example, a candidate vehicle control action from a first instance of a first 4400 scenario-specific operational control evaluation module may indicate a 'stop' vehicle control action, a candidate vehicle control action from a second instance of a second 4400 scenario-specific operational control evaluation module may indicate a 'forward' vehicle control action, and a candidate vehicle control action from a third instance of a third 4400 scenario-specific operational control evaluation module may indicate a 'proceed' vehicle control action.
[0119] The 4100 autonomous vehicle operational management controller can determine whether to traverse a part of the vehicle transport network according to one or more candidate vehicle control actions. For example, the 4100 autonomous vehicle operational management controller can receive multiple candidate vehicle control actions from multiple instances of the 4400 scenario-specific operational control evaluation modules, can identify a vehicle control action from the candidate vehicle control actions, and can traverse the vehicle transport network according to the identified vehicle control action.
[0120] In some embodiments, the 4100 autonomous vehicle operational management controller can identify a vehicle control action from the candidate vehicle control actions based on one or more defined vehicle control action identification metrics.
[0121] In some embodiments, the defined vehicle control action identification metrics may include a priority, classification or weight associated with each type of vehicle control action, and identifying the vehicle control action from the candidate vehicle control actions may include identifying a highest-priority vehicle control action from the candidate vehicle control actions. For example, the 'stop' vehicle control action can be associated with a high priority, the 'forward' vehicle control action can be associated with an intermediate priority, which can be lower than the high priority, and the 'proceed' vehicle control action can be associated with a low priority, which can be lower than the intermediate priority. In one example, the candidate vehicle control actions may include one or more 'stop' vehicle control actions, and the 'stop' vehicle control action can be identified as the vehicle control action. In another example, the candidate vehicle control actions may omit a 'stop' vehicle control action, may include one or more 'forward' vehicle control actions, and the 'forward' vehicle control action can be identified as the vehicle control action. In another example, the candidate vehicle control actions may omit a 'stop' vehicle control action, may omit a 'forward' vehicle control action, may include one or more 'proceed' vehicle control actions, and the 'proceed' vehicle control action can be identified as the vehicle control action.
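The priority rule above can be sketched as follows. The numeric priority values are hypothetical; only their ordering ('stop' above 'forward' above 'proceed') comes from the text.

```python
# Higher value = higher priority: 'stop' dominates 'forward',
# which dominates 'proceed'.
ACTION_PRIORITY = {"stop": 3, "forward": 2, "proceed": 1}

def select_action(candidates):
    """Identify the highest-priority candidate vehicle control action."""
    return max(candidates, key=lambda a: ACTION_PRIORITY[a])

print(select_action(["proceed", "stop", "forward"]))  # stop
print(select_action(["proceed", "forward"]))          # forward
print(select_action(["proceed", "proceed"]))          # proceed
```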
[0122] In some embodiments, identifying the vehicle control action from the candidate vehicle control actions may include generating or calculating a weighted average for each type of vehicle control action based on the defined vehicle control action identification metrics, on the instantiated scenarios, on weights associated with the instantiated scenarios, on the candidate vehicle control actions, on weights associated with the candidate vehicle control actions, or on a combination thereof.
[0123] For example, identifying the vehicle control action from the candidate vehicle control actions may include implementing a machine learning component, such as supervised learning for a classification problem, and training the machine learning component using examples, such as 1,000 examples, of the corresponding vehicle operating scenario. In another example, identifying the vehicle control action from the candidate vehicle control actions may include implementing a Markov Decision Process, or a Partially Observable Markov Decision Process, which can describe how respective candidate vehicle control actions affect subsequent candidate vehicle control actions, and may include a reward function that outputs a positive or negative reward for respective vehicle control actions.
[0124] The 4100 autonomous vehicle operational management controller can uninstantiate an instance of a 4400 scenario-specific operational control evaluation module. For example, the 4100 autonomous vehicle operational management controller can identify a distinct set of operating conditions as indicating a distinct vehicle operating scenario for the autonomous vehicle, instantiate an instance of a 4400 scenario-specific operational control evaluation module for the distinct vehicle operating scenario, monitor the operating conditions, and subsequently determine that one or more of the operating conditions have expired, or are likely to affect the operation of the autonomous vehicle below a defined threshold, and the 4100 autonomous vehicle operational management controller can uninstantiate the instance of the 4400 scenario-specific operational control evaluation module.
[0125] The 4200 blocking monitor can receive operating environment information representing an operating environment, or an aspect thereof, for the autonomous vehicle. For example, the 4200 blocking monitor can receive the operating environment information from the 4100 autonomous vehicle operational management controller, from an autonomous vehicle sensor, from an external device, such as a remote vehicle or an infrastructure device, or a combination thereof. In some embodiments, the 4200 blocking monitor can read the operating environment information, or a part thereof, from a memory, such as a memory of the autonomous vehicle, such as the memory 1340 shown in figure 1.
[0126] Although not expressly shown in figure 4, the autonomous vehicle operational management system 4000 can include a predictor module that can generate and send prediction information to the 4200 blocking monitor, and the 4200 blocking monitor can output availability probability information to one or more of the 4300 operating environment monitors.
[0127] The 4200 blocking monitor can determine a respective availability probability, or corresponding blocking probability, for one or more parts of the vehicle transport network, such as parts of the vehicle transport network proximate to the autonomous vehicle, which may include parts of the vehicle transport network corresponding to an expected path of the autonomous vehicle, such as an expected path identified based on a current route of the autonomous vehicle.
[0128] An availability probability, or corresponding blocking probability, may indicate a probability or likelihood that the autonomous vehicle can safely traverse a part of, or spatial location within, the vehicle transport network, such as unimpeded by an external object, such as a remote vehicle or a pedestrian. For example, a part of the vehicle transport network may include an obstruction, such as a stationary object, and an availability probability for that part of the vehicle transport network may be low, such as 0%, which can be expressed as a high blocking probability, such as 100%, for that part of the vehicle transport network.
[0129] The 4200 blocking monitor can identify a respective availability probability for each of multiple parts of the vehicle transport network within an operational environment of the autonomous vehicle, such as within 300 meters.
[0130] In some embodiments, the 4200 blocking monitor can identify a part of the vehicle transport network and a corresponding availability probability based on operation information for the autonomous vehicle, operation information for one or more external objects, vehicle transport network information representing the vehicle transport network, or a combination thereof. In some embodiments, the operation information for the autonomous vehicle may include information indicating a geospatial location of the autonomous vehicle in the vehicle transport network, which may be a current location or an expected location, such as an expected location identified based on an expected path for the autonomous vehicle. In some embodiments, the operation information for the external objects may indicate a respective geospatial location of one or more external objects in, or near, the vehicle transport network, which may be a current location or an expected location, such as an expected location identified based on an expected path for the respective external object.
[0131] In some embodiments, an availability probability can be indicated by the 4200 blocking monitor corresponding to each external object in the operating environment of the autonomous vehicle, and a geospatial area can be associated with multiple availability probabilities corresponding to multiple external objects. In some embodiments, an aggregate availability probability can be indicated by the 4200 blocking monitor corresponding to each type of external object in the operating environment of the autonomous vehicle, such as an availability probability for pedestrians and an availability probability for remote vehicles, and a geospatial area can be associated with multiple availability probabilities corresponding to multiple types of external objects. In some embodiments, the 4200 blocking monitor can indicate an aggregate availability probability for each geospatial location, which can include multiple temporal availability probabilities for a geospatial location.
[0132] In some embodiments, the 4200 blocking monitor can identify external objects, track external objects, project location information, path information or both for external objects, or a combination thereof. For example, the 4200 blocking monitor can identify an external object and can identify an expected path for the external object, which can indicate a sequence of expected spatial locations, expected temporal locations and corresponding probabilities.
[0133] In some embodiments, the blocking monitor can identify the expected path for an external object based on operating environment information, such as information indicating a current location of the external object, information indicating a current path for the external object, information indicating a classification type of the external object, such as information classifying the external object as a pedestrian or a remote vehicle, vehicle transport network information, such as information indicating that the vehicle transport network includes a crosswalk proximate to the external object, previously identified or tracked information associated with the external object, or any combination thereof. For example, the external object can be identified as a remote vehicle, and the expected path for the remote vehicle can be identified based on information indicating a current location of the remote vehicle, information indicating a current path of the remote vehicle, information indicating a current speed of the remote vehicle, vehicle transport network information corresponding to the remote vehicle, legal or regulatory information, or a combination thereof.
[0134] In some embodiments, the 4200 blocking monitor can determine, or update, availability probabilities continuously or periodically. In some embodiments, one or more classes or types of external objects can be identified as preferentially blocking, and the expected path of a preferentially blocking external object can spatially and temporally overlap the expected path of another preferentially blocking external object. For example, a pedestrian's expected path may overlap another pedestrian's expected path. In some embodiments, one or more classes or types of external objects can be identified as deferentially blocked, and the expected path of a deferentially blocked external object can be blocked, such as impeded or otherwise affected, by other external objects. For example, the expected path for a remote vehicle can be blocked by another remote vehicle or by a pedestrian.
[0135] In some embodiments, the 4200 blocking monitor can identify expected paths for preferentially blocking external objects, such as pedestrians, and can identify expected paths for deferentially blocked external objects, such as remote vehicles, subject to the expected paths for the preferentially blocking external objects. In some embodiments, the 4200 blocking monitor can provide the availability probabilities, or corresponding blocking probabilities, to the 4100 autonomous vehicle operational management controller. The 4100 autonomous vehicle operational management controller can provide the availability probabilities, or corresponding blocking probabilities, to respective instantiated instances of the 4400 scenario-specific operational control evaluation modules.
[0136] Each 4400 scenario-specific operational control assessment module can model a respective distinct vehicle operating scenario. The autonomous vehicle operational management system 4000 can include any number of 4400 scenario-specific operational control assessment modules, each modeling a respective distinct vehicle operating scenario.
[0137] In some embodiments, modeling a distinct vehicle operating scenario, using a 4400 scenario-specific operational control evaluation module, may include generating, maintaining, or both, state information representing aspects of the operating environment of the autonomous vehicle corresponding to the distinct vehicle operating scenario, identifying potential interactions between the modeled aspects of the respective states, and determining a candidate vehicle control action that solves the model. In some embodiments, aspects of the operating environment of the autonomous vehicle other than the defined set of aspects of the operating environment of the autonomous vehicle corresponding to the distinct vehicle operating scenario can be omitted from the model.
[0138] The autonomous vehicle operational management system 4000 can be solution independent and can include any model of a distinct vehicle operating scenario, such as a single-agent model, a multi-agent model, a learning model, or any other model of one or more distinct vehicle operating scenarios.
[0139] One or more of the 4400 scenario-specific operational control evaluation modules can be a Classical Planning (CP) model, which can be a single-agent model, and which can model a distinct vehicle operating scenario based on a defined input state, which can indicate respective non-probabilistic states of the elements of the operating environment of the autonomous vehicle for the distinct vehicle operating scenario modeled by the 4400 scenario-specific operational control evaluation modules. In a Classical Planning model, one or more aspects, such as geospatial location, of modeled elements, such as external objects, associated with a temporal location may differ from the corresponding aspects associated with another temporal location, such as an immediately subsequent temporal location, non-probabilistically, such as by a defined or fixed amount. For example, at a first temporal location, a remote vehicle may have a first geospatial location, and at a second, immediately subsequent temporal location the remote vehicle may have a second geospatial location that differs from the first geospatial location by a defined geospatial distance, such as a defined number of meters, along an expected path for the remote vehicle.
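The non-probabilistic transition described above can be sketched as follows; the fixed step size is a hypothetical value for illustration.

```python
def classical_step(position_m: float, step_m: float = 5.0) -> float:
    """Advance a modeled remote vehicle along its expected path by a
    fixed, non-probabilistic distance per temporal location, as in a
    Classical Planning model."""
    return position_m + step_m

# Three successive temporal locations along the expected path:
p = 0.0
for _ in range(3):
    p = classical_step(p)
print(p)  # 15.0
```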
[0140] One or more of the 4400 scenario-specific operational control evaluation modules can be a discrete-time stochastic control process, such as a Markov Decision Process (MDP) model, which can be a single-agent model, and which can model a distinct vehicle operating scenario based on a defined input state. Changes to the operating environment of the autonomous vehicle, such as a change of location of an external object, can be modeled as probabilistic changes. A Markov Decision Process model may use more processing resources and may model the distinct vehicle operating scenario more accurately than a Classical Planning (CP) model.
[0141] A Markov Decision Process model can model a distinct vehicle operating scenario as a sequence of temporal locations, such as a current temporal location, future temporal locations, or both, with corresponding states, such as a current state, expected future states, or both. At each temporal location the model can have a state, which can be an expected state, and which can be associated with one or more candidate vehicle control actions. The model can represent the autonomous vehicle as an agent, which can transition, over the sequence of temporal locations, from one state (a current state) to another state (a subsequent state) according to an action identified for the current state and a probability that the identified action will transition the state from the current state to the subsequent state.
[0142] The model can accrue a reward, which can be a positive or negative value, corresponding to transitioning from one state to another in accordance with the respective action. The model can solve the distinct vehicle operating scenario by identifying the actions corresponding to each state in the sequence of temporal locations that maximize the cumulative reward. Solving a model can include identifying a vehicle control action responsive to the modeled scenario and the operating environment information.
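The solution procedure described above can be sketched, for purposes of illustration only, as value iteration over a toy Markov Decision Process. The states, actions, transition probabilities, and rewards below are hypothetical and are not taken from the disclosure; value iteration is one generic way to maximize cumulative reward, not necessarily the solver used by the system described herein.

```python
# Illustrative sketch only: a generic value-iteration solver over a
# hypothetical two-state intersection MDP. All names and values are invented.
def value_iteration(states, actions, T, R, gamma=0.9, eps=1e-6):
    """Solve an MDP by iterating the Bellman optimality update.

    T[(s, a)] is a list of (next_state, probability) pairs;
    R[(s, a)] is the immediate reward for taking action a in state s.
    Returns the value of each state and a greedy policy.
    """
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            q = {a: R[(s, a)] + gamma * sum(p * V[s2] for s2, p in T[(s, a)])
                 for a in actions[s]}
            best = max(q.values())
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            break
    policy = {s: max(actions[s],
                     key=lambda a: R[(s, a)] + gamma *
                     sum(p * V[s2] for s2, p in T[(s, a)]))
              for s in states}
    return V, policy

# Hypothetical scenario: 'approach' an intersection, then 'clear' it.
states = ['approach', 'clear']
actions = {'approach': ['proceed', 'stop'], 'clear': ['proceed']}
T = {('approach', 'proceed'): [('clear', 0.9), ('approach', 0.1)],
     ('approach', 'stop'): [('approach', 1.0)],
     ('clear', 'proceed'): [('clear', 1.0)]}
R = {('approach', 'proceed'): 1.0, ('approach', 'stop'): -0.1,
     ('clear', 'proceed'): 0.0}
V, policy = value_iteration(states, actions, T, R)
```

Under these invented rewards, the greedy policy selects 'proceed' at the approach state because the discounted expected reward of proceeding exceeds that of waiting.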
[0143] A Markov Decision Process model can model a distinct vehicle operating scenario using a set of states, a set of actions, a set of state transition probabilities, a reward function, or a combination thereof. In some embodiments, modeling a distinct vehicle operating scenario can include using a discount factor, which can adjust, or discount, the output of the reward function applied to subsequent temporal periods.
[0144] The set of states can include a current state of the Markov Decision Process model, one or more possible subsequent states of the Markov Decision Process model, or a combination thereof. A state can represent an identified condition, which can be an expected condition, of respective defined aspects, such as external objects and traffic control devices, of the operating environment of the autonomous vehicle that may affect the operation of the autonomous vehicle at a distinct temporal location. For example, a remote vehicle operating in proximity to the autonomous vehicle can affect the operation of the autonomous vehicle and can be represented in a Markov Decision Process model, which can include representing an identified or expected geospatial location of the remote vehicle, an identified or expected path, heading, or both, of the remote vehicle, an identified or expected velocity of the remote vehicle, an identified or expected acceleration or deceleration rate of the remote vehicle, or a combination thereof, corresponding to the respective temporal location. Upon instantiation, the current state of the Markov Decision Process model can correspond to a contemporaneous state or condition of the operating environment. A respective set of states can be defined for each distinct vehicle operating scenario.
[0145] Although any number or cardinality of states can be used, the number or cardinality of states included in a model can be limited to a defined maximum number of states, such as 300 states. For example, a model can include the 300 most probable states for a corresponding scenario.
[0146] The set of actions can include the vehicle control actions available to the Markov Decision Process model at each state in the set of states. A respective set of actions can be defined for each distinct vehicle operating scenario.
[0147] The set of state transition probabilities can probabilistically represent potential or expected changes to the operating environment of the autonomous vehicle, as represented by the states, responsive to the actions. For example, a state transition probability can indicate a probability that the operating environment of the autonomous vehicle corresponds to a respective state at a respective temporal location immediately subsequent to a current temporal location corresponding to a current state, responsive to the autonomous vehicle traversing the vehicle transport network from the current state in accordance with a respective action.
[0148] The set of state transition probabilities can be identified based on the operational environment information. For example, operational environment information can indicate an area type, such as urban or rural, a time of day, an ambient light level, weather conditions, traffic conditions, which can include expected traffic conditions, such as rush hour conditions, event-related traffic congestion, or holiday-related driver behavior conditions, road conditions, jurisdictional conditions, such as country, state or municipality conditions, or any other condition or combination of conditions that may affect the operation of the autonomous vehicle.
[0149] Examples of state transition probabilities associated with a vehicle operating scenario including a pedestrian can include a defined probability of a pedestrian jaywalking, crossing the road outside a crosswalk, which can be based on a geospatial distance between the pedestrian and the respective road segment; a defined probability of a pedestrian stopping at an intersection; a defined probability of a pedestrian crossing at a crosswalk; a defined probability of a pedestrian yielding to the autonomous vehicle at a crosswalk; or any other probability associated with a vehicle operating scenario including a pedestrian.
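The defined probabilities above, and their identification from operating environment information as described in paragraph [0148], might be represented as in the following sketch. The probability names, numeric values, and adjustment heuristics are hypothetical assumptions for illustration, not figures from the disclosure.

```python
# Illustrative sketch only: hypothetical defined state-transition
# probabilities for a pedestrian scenario, adjusted by operating
# environment information. Values and heuristics are invented.
BASE_PEDESTRIAN_PROBABILITIES = {
    'jaywalk': 0.05,           # pedestrian crosses outside a crosswalk
    'stop_at_intersection': 0.80,
    'use_crosswalk': 0.90,
    'yield_to_vehicle': 0.70,  # pedestrian yields to the autonomous vehicle
}

def transition_probabilities(base, operating_environment):
    """Adjust defined state-transition probabilities using operating
    environment information, such as area type or weather conditions."""
    probs = dict(base)
    if operating_environment.get('area') == 'urban':
        # Assumed heuristic: jaywalking is more likely in urban areas.
        probs['jaywalk'] = min(1.0, probs['jaywalk'] * 2.0)
    if operating_environment.get('weather') == 'rain':
        # Assumed heuristic: pedestrians yield less often in the rain.
        probs['yield_to_vehicle'] *= 0.9
    return probs

urban_rain = transition_probabilities(
    BASE_PEDESTRIAN_PROBABILITIES, {'area': 'urban', 'weather': 'rain'})
```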
[0150] Examples of state transition probabilities associated with a vehicle operating scenario including an intersection can include a defined probability of a remote vehicle arriving at the intersection; a defined probability of a remote vehicle overtaking the autonomous vehicle; a defined probability of a remote vehicle traversing the intersection immediately subsequent to, and in close proximity to, a second remote vehicle traversing the intersection, such as in the absence of right of way (piggybacking); a defined probability of a remote vehicle stopping, adjacent to the intersection, in accordance with a traffic control device, regulation, or other right-of-way indication, prior to traversing the intersection; a defined probability of a remote vehicle traversing the intersection; a defined probability of a remote vehicle deviating from an expected path proximate to the intersection; a defined probability of a remote vehicle deviating from an expected right-of-way priority; or any other probability associated with a vehicle operating scenario including an intersection.
[0151] Examples of state transition probabilities associated with a vehicle operating scenario including a lane change can include a defined probability of a remote vehicle changing velocity, such as a defined probability of a remote vehicle behind the autonomous vehicle increasing velocity or a defined probability of a remote vehicle in front of the autonomous vehicle decreasing velocity; a defined probability of a remote vehicle in front of the autonomous vehicle changing lanes; a defined probability of a remote vehicle proximate to the autonomous vehicle changing velocity to allow the autonomous vehicle to merge into a lane; or any other probability associated with a vehicle operating scenario including a lane change.
[0152] The reward function can determine a respective positive or negative value (cost) that can be accrued for each combination of state and action, which can represent an expected value of the autonomous vehicle traversing the vehicle transport network from the corresponding state, in accordance with the corresponding vehicle control action, to the subsequent state.
[0153] The reward function can be identified based on the operating environment information. For example, operational environment information can indicate an area type, such as urban or rural, a time of day, an ambient light level, weather conditions, traffic conditions, which can include expected traffic conditions, such as rush hour conditions, event-related traffic congestion, or holiday-related driver behavior conditions, road conditions, jurisdictional conditions, such as country, state or municipality conditions, or any other condition or combination of conditions that may affect the operation of the autonomous vehicle.
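As an illustration of a reward function determining a positive or negative value (cost) per combination of state and action, consider the following sketch. The state names, action names, and numeric values are hypothetical; the disclosure does not specify numeric rewards.

```python
# Illustrative sketch only: a hypothetical reward function over invented
# states and actions for a pedestrian scenario.
def reward(state, action):
    """Return the positive or negative value (cost) accrued for a
    state/action combination, representing the expected value of
    traversing the vehicle transport network via that action."""
    if state == 'pedestrian_in_crosswalk' and action == 'proceed':
        return -100.0  # large negative value: do not proceed into the crosswalk
    if action == 'stop':
        return -1.0    # small time cost of waiting
    if state == 'crosswalk_clear' and action == 'proceed':
        return 10.0    # value of making progress along the route
    return 0.0

safe = reward('pedestrian_in_crosswalk', 'stop')
unsafe = reward('pedestrian_in_crosswalk', 'proceed')
```

With such values, an action sequence that waits for the pedestrian accrues a higher cumulative reward than one that proceeds, so a solver maximizing cumulative reward would select 'stop'.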
[0154] One or more of the scenario-specific operational control evaluation modules 4400 can be a Partially Observable Markov Decision Process (POMDP) model, which can be a single-agent model. A Partially Observable Markov Decision Process model may be similar to a Markov Decision Process model, except that a Partially Observable Markov Decision Process model can include modeling uncertain states. A Partially Observable Markov Decision Process model can include modeling confidence, sensor trustworthiness, distraction, noise, uncertainty, such as sensor uncertainty, or the like. A Partially Observable Markov Decision Process model can use more processing resources, and can model the distinct vehicle operating scenario more accurately, than a Markov Decision Process model.
[0155] A Partially Observable Markov Decision Process model can model a distinct vehicle operating scenario using a set of states, a set of actions, a set of state transition probabilities, a reward function, a set of observations, a set of conditional observation probabilities, or a combination thereof. The set of states, the set of actions, the set of state transition probabilities, and the reward function can be similar to those described above with respect to the Markov Decision Process model.
[0156] The set of observations can include observations corresponding to the respective states. An observation can provide information about the attributes of a respective state. An observation can correspond with a respective temporal location. An observation can include operating environment information, such as sensor information. An observation can include expected or predicted operating environment information.
[0157] For example, a Partially Observable Markov Decision Process model can include an autonomous vehicle at a first geospatial location and first temporal location corresponding to a first state; the model can indicate that the autonomous vehicle can identify and perform, or attempt to perform, a vehicle control action to traverse the vehicle transport network from the first geospatial location to a second geospatial location at a second temporal location immediately subsequent to the first temporal location; and the set of observations corresponding to the second temporal location can include the operating environment information that can be identified corresponding to the second temporal location, such as geospatial location information for the autonomous vehicle, geospatial location information for one or more external objects, availability probabilities, expected path information, or the like.
[0158] The set of conditional observation probabilities can include probabilities of making respective observations based on the operating environment of the autonomous vehicle. For example, an autonomous vehicle can approach an intersection by traversing a first road and, concurrently, a remote vehicle can approach the intersection by traversing a second road; the autonomous vehicle can identify and evaluate operating environment information, such as sensor information, corresponding to the intersection, which can include operating environment information corresponding to the remote vehicle. In some embodiments, the operating environment information can be inaccurate, incomplete, or erroneous. In a Markov Decision Process model, the autonomous vehicle can non-probabilistically identify the remote vehicle, which can include identifying a location of the remote vehicle, an expected path for the remote vehicle, or the like, and the identified information, such as the identified location of the remote vehicle, based on inaccurate operating environment information, can be inaccurate or erroneous. In a Partially Observable Markov Decision Process model, the autonomous vehicle can identify information probabilistically identifying the remote vehicle, which can include probabilistically identifying location information for the remote vehicle, such as location information indicating that the remote vehicle may be proximate to the intersection. The conditional observation probability corresponding to observing, or probabilistically identifying, the location of the remote vehicle can represent the probability that the identified operating environment information accurately represents the location of the remote vehicle.
[0159] The set of conditional observation probabilities can be identified based on the operating environment information. For example, the operating environment information can indicate an area type, such as urban or rural, a time of day, an ambient light level, weather conditions, traffic conditions, which can include expected traffic conditions, such as rush hour conditions, event-related traffic congestion, or holiday-related driver behavior conditions, road conditions, jurisdictional conditions, such as country, state, or municipality conditions, or any other condition or combination of conditions that can affect the operation of the autonomous vehicle.
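The role of conditional observation probabilities can be illustrated with a standard Bayesian belief update, a generic Partially Observable Markov Decision Process mechanism rather than code from the disclosure. The hidden states and the observation model below are hypothetical.

```python
# Illustrative sketch only: a generic Bayes update of a belief over hidden
# states, using hypothetical conditional observation probabilities.
def update_belief(belief, observation, O):
    """Update a belief over hidden states given an observation.

    belief: dict mapping state -> prior probability.
    O[(state, observation)]: probability of making `observation` when the
    true state is `state` (the conditional observation probabilities).
    """
    posterior = {s: belief[s] * O.get((s, observation), 0.0) for s in belief}
    total = sum(posterior.values())
    if total == 0.0:
        return belief  # observation impossible under the model; keep prior
    return {s: p / total for s, p in posterior.items()}

# Hidden state: is the remote vehicle near the intersection?
belief = {'rv_near': 0.5, 'rv_far': 0.5}
# Assumed sensor model: detects a near remote vehicle 90% of the time,
# and falsely reports one 20% of the time when it is far.
O = {('rv_near', 'detected_near'): 0.9, ('rv_near', 'no_detection'): 0.1,
     ('rv_far', 'detected_near'): 0.2, ('rv_far', 'no_detection'): 0.8}
belief = update_belief(belief, 'detected_near', O)
```

After observing a detection, the belief that the remote vehicle is near rises from 0.5 to roughly 0.82, reflecting that the identified operating environment information only probabilistically represents the true location.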
[0160] In some embodiments, such as embodiments implementing a Partially Observable Markov Decision Process model, modeling an autonomous vehicle operational control scenario can include modeling obstructions. For example, the operating environment information can include information corresponding to one or more obstructions, such as sensor obstructions, in the operating environment of the autonomous vehicle, such that the operating environment information can omit information representing one or more obstructed external objects in the operating environment of the autonomous vehicle. For example, an obstruction can be an external object, such as a traffic light, a building, a tree, an identified external object, or any other operational condition or combination of operational conditions capable of obstructing one or more other operational conditions, such as external objects, from the autonomous vehicle at a defined spatiotemporal location. In some embodiments, an operating environment monitor 4300 can identify obstructions, can identify or determine a probability that an external object is obstructed, or hidden, by an identified obstruction, and can include obstructed vehicle probability information in the operating environment information output to the autonomous vehicle operational management controller 4100, and communicated, by the autonomous vehicle operational management controller 4100, to the respective scenario-specific operational control evaluation modules 4400.
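A probability that an external object is obstructed might be computed, for example, as follows. This sketch assumes the identified obstructions act independently, a simplifying assumption of this illustration that is not stated in the disclosure.

```python
# Illustrative sketch only: combine per-obstruction hiding probabilities
# into an overall probability that an external object is obstructed,
# assuming independence between obstructions (an invented assumption).
def occlusion_probability(obstruction_probs):
    """Probability that an external object is hidden by at least one of
    the identified obstructions."""
    visible = 1.0
    for p in obstruction_probs:
        visible *= (1.0 - p)
    return 1.0 - visible

# Hypothetical example: a parked truck (0.5) and a tree (0.2).
p = occlusion_probability([0.5, 0.2])
```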
[0161] In some embodiments, one or more of the scenario-specific operational control evaluation modules 4400 can be a Decentralized Partially Observable Markov Decision Process (Dec-POMDP) model, which can be a multi-agent model, and which can model a distinct vehicle operating scenario. A Decentralized Partially Observable Markov Decision Process model may be similar to a Partially Observable Markov Decision Process model, except that a Partially Observable Markov Decision Process model can model the autonomous vehicle and a subset, such as one, of the external objects, whereas a Decentralized Partially Observable Markov Decision Process model can model the autonomous vehicle and the set of external objects.
[0162] In some embodiments, one or more of the scenario-specific operational control evaluation modules 4400 can be a Partially Observable Stochastic Game (POSG) model, which can be a multi-agent model, and which can model a distinct vehicle operating scenario. A Partially Observable Stochastic Game model may be similar to a Decentralized Partially Observable Markov Decision Process model, except that the Decentralized Partially Observable Markov Decision Process model can include a reward function for the autonomous vehicle, whereas the Partially Observable Stochastic Game model can include the reward function for the autonomous vehicle and a respective reward function for each external object.
[0163] In some embodiments, one or more of the scenario-specific operational control evaluation modules 4400 can be a Reinforcement Learning (RL) model, which can be a learning model, and which can model a distinct vehicle operating scenario. A Reinforcement Learning model can be similar to a Markov Decision Process model or a Partially Observable Markov Decision Process model, except that defined state transition probabilities, observation probabilities, a reward function, or any combination thereof, can be omitted from the model.
[0164] In some embodiments, a Reinforcement Learning model can be a model-based Reinforcement Learning model, which can include generating state transition probabilities, observation probabilities, a reward function, or any combination thereof, based on one or more modeled or observed events.
[0165] In a Reinforcement Learning model, the model can evaluate one or more events or interactions, which can be simulated events, such as traversing an intersection, traversing the vehicle transport network proximate to a pedestrian, or performing a lane change, and can generate, or modify, a corresponding model, or a solution thereof, responsive to the respective event. For example, the autonomous vehicle can traverse an intersection using a Reinforcement Learning model. The Reinforcement Learning model can indicate a candidate vehicle control action for traversing the intersection. The autonomous vehicle can traverse the intersection using the candidate vehicle control action as the vehicle control action for a temporal location. The autonomous vehicle can determine a result of traversing the intersection using the candidate vehicle control action, and can update the model based on the result.
[0166] In an example, at a first temporal location, a remote vehicle may be stationary at an intersection with a prohibited right-of-way indication, such as a red light. The Reinforcement Learning model may indicate a 'proceed' candidate vehicle control action for the first temporal location. The Reinforcement Learning model may include a probability of identifying operating environment information at a subsequent temporal location, subsequent to traversing the vehicle transport network in accordance with the identified candidate vehicle control action, indicating that a geospatial location of the remote vehicle corresponding to the first temporal location differs from a geospatial location of the remote vehicle corresponding to the second temporal location, and that probability may be low, such as 0/100. The autonomous vehicle can traverse the vehicle transport network in accordance with the identified candidate vehicle control action, can subsequently determine that the geospatial location of the remote vehicle corresponding to the first temporal location differs from the geospatial location of the remote vehicle corresponding to the second temporal location, and can modify, or update, the probability accordingly to incorporate the identified event, such as to 1/101.
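The 0/100 to 1/101 update in the example above corresponds to a simple count-based probability estimate, sketched here for illustration; the function name and shape are hypothetical.

```python
# Illustrative sketch only: update an event probability from observed
# outcomes, as in the 0/100 -> 1/101 example in the text.
def update_event_probability(successes, trials, event_observed):
    """Record one more trial, counted as a success if the modeled event
    occurred, and return the updated counts and probability."""
    trials += 1
    if event_observed:
        successes += 1
    return successes, trials, successes / trials

# The remote vehicle unexpectedly moved: 0/100 becomes 1/101.
s, t, p = update_event_probability(0, 100, True)
```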
[0167] In another example, the Reinforcement Learning model can indicate a defined positive expected reward for traversing the vehicle transport network from a first temporal location to a second temporal location in accordance with an identified vehicle control action and in accordance with identified operating environment information, which can be probabilistic. The autonomous vehicle can traverse the vehicle transport network in accordance with the identified vehicle control action. The autonomous vehicle can determine, based on subsequently identified operating environment information, which can be probabilistic, that the operating environment information corresponding to the second temporal location is substantially similar to the identified operating environment information corresponding to the first temporal location, which can indicate a cost, such as in time, of traversing the vehicle transport network in accordance with the identified vehicle control action, and the Reinforcement Learning model can reduce the corresponding expected reward.
[0168] The autonomous vehicle operational management system 4000 can include any number or combination of types of models. For example, the pedestrian scenario-specific operational control evaluation module 4410, the intersection scenario-specific operational control evaluation module 4420, and the lane change scenario-specific operational control evaluation module 4430 can be Partially Observable Markov Decision Process models. In another example, the pedestrian scenario-specific operational control evaluation module 4410 can be a Markov Decision Process model, and the intersection scenario-specific operational control evaluation module 4420 and the lane change scenario-specific operational control evaluation module 4430 can be Partially Observable Markov Decision Process models.
[0169] The autonomous vehicle operational management controller 4100 can instantiate any number of instances of the scenario-specific operational control evaluation modules 4400 based on the operating environment information.
[0170] For example, the operating environment information can include information representing a pedestrian approaching an intersection along an expected path for the autonomous vehicle, and the autonomous vehicle operational management controller 4100 can identify a vehicle operating scenario including a pedestrian, a vehicle operating scenario including an intersection, or both. The autonomous vehicle operational management controller 4100 can instantiate an instance of the pedestrian scenario-specific operational control evaluation module 4410, an instance of the intersection scenario-specific operational control evaluation module 4420, or both.
[0171] In another example, the operating environment information can include information representing more than one pedestrian at or near an intersection along an expected path for the autonomous vehicle. The autonomous vehicle operational management controller 4100 can identify pedestrian operational scenarios corresponding to one or more of the pedestrians, a vehicle operating scenario including an intersection, or a combination thereof. The autonomous vehicle operational management controller 4100 can instantiate instances of the pedestrian scenario-specific operational control evaluation module 4410 for some or all of the pedestrian operational scenarios, an instance of the intersection scenario-specific operational control evaluation module 4420, or a combination thereof.
[0172] The pedestrian scenario-specific operational control evaluation module 4410 can be a model of an autonomous vehicle operational control scenario that includes the autonomous vehicle traversing a portion of the vehicle transport network proximate to a pedestrian (pedestrian scenario). The pedestrian scenario-specific operational control evaluation module 4410 can receive operating environment information, such as the pedestrian information generated by the pedestrian operating environment monitor 4310, from the autonomous vehicle operational management controller 4100.
[0173] The pedestrian scenario-specific operational control evaluation module 4410 can model pedestrian behavior corresponding to the pedestrian traversing a portion of the vehicle transport network or otherwise probabilistically affecting the operation of the autonomous vehicle. In some embodiments, the pedestrian scenario-specific operational control evaluation module 4410 can model a pedestrian as acting in accordance with pedestrian model rules expressing probable pedestrian behavior. For example, the pedestrian model rules can express vehicle transport network regulations, pedestrian transport network regulations, predicted pedestrian behavior, social norms, or a combination thereof. For example, the pedestrian model rules can indicate a probability that a pedestrian may traverse a portion of the vehicle transport network via a crosswalk or other defined pedestrian access area. In some embodiments, the pedestrian scenario-specific operational control evaluation module 4410 can model a pedestrian as acting independently of defined vehicle transport network regulations, defined pedestrian transport network regulations, or both, such as by jaywalking.
[0174] The pedestrian scenario-specific operational control evaluation module 4410 can output a candidate vehicle control action, such as a 'stop' candidate vehicle control action, an 'advance' candidate vehicle control action, or a 'proceed' candidate vehicle control action. In some embodiments, the candidate vehicle control action can be a compound vehicle control action. For example, the candidate vehicle control action can include an 'advance' vehicle control action, which can be an indirect signaling pedestrian communication vehicle control action, and can include a direct signaling pedestrian communication vehicle control action, such as flashing the headlights of the autonomous vehicle or sounding the horn of the autonomous vehicle. An example of an autonomous vehicle operational control scenario that includes the autonomous vehicle traversing a portion of the vehicle transport network proximate to a pedestrian is shown in figure 7.
[0175] The intersection scenario-specific operational control evaluation module 4420 can be a model of an autonomous vehicle operational control scenario that includes the autonomous vehicle traversing a portion of the vehicle transport network that includes an intersection. The intersection scenario-specific operational control evaluation module 4420 can model the behavior of remote vehicles traversing an intersection in the vehicle transport network, or otherwise probabilistically affecting the operation of the autonomous vehicle traversing the intersection. An intersection can include any portion of the vehicle transport network where a vehicle can transfer from one road to another.
[0176] In some embodiments, modeling an autonomous vehicle operational control scenario that includes the autonomous vehicle traversing a portion of the vehicle transport network that includes an intersection can include determining a right-of-way order for vehicles traversing the intersection, such as by negotiating with remote vehicles.
[0177] In some embodiments, modeling an autonomous vehicle operational control scenario that includes the autonomous vehicle traversing a portion of the vehicle transport network that includes an intersection can include modeling one or more traffic controls, such as a stop sign, a yield sign, a traffic light, or any other traffic control device, regulation, or signal, or combination thereof.
[0178] In some embodiments, modeling an autonomous vehicle operational control scenario that includes the autonomous vehicle traversing a portion of the vehicle transport network that includes an intersection can include outputting an 'advance' candidate vehicle control action, receiving information, such as sensor information, responsive to the autonomous vehicle performing the 'advance' candidate vehicle control action, and outputting a subsequent candidate vehicle control action based on the received information.
[0179] In some embodiments, modeling an autonomous vehicle operational control scenario that includes the autonomous vehicle traversing a portion of the vehicle transport network that includes an intersection can include modeling a probability that a remote vehicle may traverse the intersection in accordance with vehicle transport network regulations. In some embodiments, modeling an autonomous vehicle operational control scenario that includes the autonomous vehicle traversing a portion of the vehicle transport network that includes an intersection can include modeling a probability that a remote vehicle may traverse the intersection independently of one or more vehicle transport network regulations, such as by following immediately behind, or piggybacking on, another remote vehicle having right of way.
[0180] The intersection scenario-specific operational control evaluation module 4420 can output a candidate vehicle control action, such as a 'stop' candidate vehicle control action, an 'advance' candidate vehicle control action, or a 'proceed' candidate vehicle control action. In some embodiments, the candidate vehicle control action can be a compound vehicle control action. For example, the candidate vehicle control action can include a 'proceed' vehicle control action and a signaling communication vehicle control action, such as activating a turn signal of the autonomous vehicle. An example of an autonomous vehicle operational control scenario that includes the autonomous vehicle traversing an intersection is shown in figure 8.
[0181] The lane change scenario-specific operational control evaluation module 4430 can be a model of an autonomous vehicle operational control scenario that includes the autonomous vehicle traversing a portion of the vehicle transport network by performing a lane change operation. The lane change scenario-specific operational control evaluation module 4430 can model the behavior of remote vehicles probabilistically affecting the operation of the autonomous vehicle performing the lane change.
[0182] In some embodiments, modeling an autonomous vehicle operational control scenario that includes the autonomous vehicle traversing a portion of the vehicle transport network by performing a lane change can include outputting a 'maintain' candidate vehicle control action, a 'proceed' vehicle control action, an 'accelerate' vehicle control action, a 'decelerate' vehicle control action, or a combination thereof. An example of an autonomous vehicle operational control scenario that includes the autonomous vehicle changing lanes is shown in figure 9.
[0183] In some embodiments, one or more of the autonomous vehicle operational management controller 4100, the blocking monitor 4200, the operating environment monitors 4300, or the scenario-specific operational control evaluation modules 4400 can operate continuously or periodically, such as at a frequency of ten hertz (10 Hz). For example, the autonomous vehicle operational management controller 4100 can identify a vehicle control action many times, such as ten times, per second. The operational frequency of each component of the autonomous vehicle operational management system 4000 can be synchronized or unsynchronized, and the operational rate of one or more of the autonomous vehicle operational management controller 4100, the blocking monitor 4200, the operating environment monitors 4300, or the scenario-specific operational control evaluation modules 4400 can be independent of the operational rate of another one or more of the autonomous vehicle operational management controller 4100, the blocking monitor 4200, the operating environment monitors 4300, or the scenario-specific operational control evaluation modules 4400.
[0184] In some embodiments, the candidate vehicle control actions output by instances of the scenario-specific operational control evaluation modules 4400 can include, or be associated with, operating environment information, such as state information, temporal information, or both. For example, a candidate vehicle control action can be associated with operating environment information representing a possible future state, a future temporal location, or both. In some embodiments, the autonomous vehicle operational management controller 4100 can identify stale candidate vehicle control actions representing past temporal locations, states having a probability of occurrence below a defined minimum threshold, or unselected candidate vehicle control actions, and can delete, omit, or ignore the stale candidate vehicle control actions.
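The pruning of stale candidate vehicle control actions might be sketched as follows. The field names and the probability threshold are hypothetical; the disclosure only states that stale candidate actions may be deleted, omitted, or ignored.

```python
# Illustrative sketch only: filter candidate vehicle control actions whose
# temporal location has passed or whose associated state is below a
# minimum probability threshold. All field names and values are invented.
def prune_candidates(candidates, now, min_probability=0.05):
    """Keep only candidate vehicle control actions tied to current or
    future temporal locations and sufficiently probable states."""
    return [c for c in candidates
            if c['temporal_location'] >= now
            and c['state_probability'] >= min_probability]

candidates = [
    {'action': 'proceed', 'temporal_location': 11, 'state_probability': 0.6},
    {'action': 'stop', 'temporal_location': 9, 'state_probability': 0.9},      # past
    {'action': 'advance', 'temporal_location': 12, 'state_probability': 0.01}, # unlikely
]
kept = prune_candidates(candidates, now=10)
```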
[0185] Figure 5 is a flow diagram of an example of autonomous vehicle operational management 5000 in accordance with embodiments of this disclosure. Autonomous vehicle operational management 5000 can be implemented in an autonomous vehicle, such as the vehicle 1000 shown in figure 1, one of the vehicles 2100/2110 shown in figure 2, a semi-autonomous vehicle, or any other vehicle implementing autonomous driving. For example, an autonomous vehicle can implement an autonomous vehicle operational management system, such as the autonomous vehicle operational management system 4000 shown in figure 4.
[0186] The autonomous vehicle operational management 5000 may include implementing or operating one or more modules or components, which may include operating an autonomous vehicle operational management controller or executor 5100, such as the autonomous vehicle operational management controller 4100 shown in figure 4; a blocking monitor 5200, such as the blocking monitor 4200 shown in figure 4; zero or more scenario-specific operational control assessment module instances (SSOCEMI) 5300, such as instances of the scenario-specific operational control assessment modules 4400 shown in figure 4; or a combination thereof.
[0187] Although not shown separately in figure 5, in some embodiments, the executor 5100 can monitor the operating environment of the autonomous vehicle, or defined aspects thereof. In some embodiments, monitoring the operating environment of the autonomous vehicle may include identifying and tracking external objects at 5110, identifying distinct vehicle operating scenarios at 5120, or a combination thereof.
[0188] The executor 5100 can identify an operating environment, or an aspect thereof, of the autonomous vehicle at 5110. Identifying the operating environment may include identifying operating environment information representing the operating environment, or one or more aspects thereof. In some embodiments, the operating environment information may include vehicle information for the autonomous vehicle, information representing the vehicle transport network, or one or more aspects thereof, proximate to the autonomous vehicle, information representing external objects, or one or more aspects thereof, within the operating environment of the autonomous vehicle, or a combination thereof.
[0189] In some embodiments, the executor 5100 can identify the operating environment information at 5110 based on sensor information, vehicle transport network information, previously identified operating environment information, or any other information or combination of information describing an aspect or aspects of the operating environment. In some embodiments, the sensor information can be processed sensor information, such as sensor information processed by an autonomous vehicle sensor information processing unit, which can receive sensor information from an autonomous vehicle sensor and can generate the processed sensor information based on the sensor information.
[0190] In some embodiments, identifying the operating environment information at 5110 may include receiving information indicating one or more aspects of the operating environment from an autonomous vehicle sensor, such as the sensor 1360 shown in figure 1 or the vehicle sensors 2105 shown in figure 2. For example, the sensor can detect an external object, such as a pedestrian, a vehicle, or any other object external to the autonomous vehicle, within a defined distance, such as 300 meters, of the autonomous vehicle, and the sensor can send sensor information indicating or representing the external object to the executor 5100. In some embodiments, the sensor, or another unit of the autonomous vehicle, can store the sensor information in a memory of the autonomous vehicle, such as the memory 1340 shown in figure 1, and the autonomous vehicle operational management controller 5100 can read the sensor information from the memory.
[0191] In some embodiments, the external object indicated by the sensor information can be indeterminate, and the autonomous vehicle operational management controller 5100 can identify object information, such as an object type, based on the sensor information, on other information, such as information from another sensor or information corresponding to a previously identified object, or on a combination thereof. In some embodiments, the sensor, or another unit of the autonomous vehicle, can identify the object information and can send the object identification information to the autonomous vehicle operational management controller 5100.
[0192] In some embodiments, the sensor information can indicate a road condition, a road feature, or a combination thereof. For example, the sensor information can indicate a road condition, such as a wet road condition, an icy road condition, or any other road condition or conditions. In another example, the sensor information can indicate road markings, such as a lane line, an aspect of road geometry, or any other road feature or features.
[0193] In some embodiments, identifying the operating environment information at 5110 may include identifying information indicating one or more aspects of the operating environment from vehicle transport network information. For example, the autonomous vehicle operational management controller 5100 can read, or otherwise receive, vehicle transport network information indicating that the autonomous vehicle is approaching an intersection, or otherwise describing a geometry or configuration of the vehicle transport network proximate to the autonomous vehicle, such as within 300 meters of the autonomous vehicle.
[0194] In some embodiments, identifying the operating environment information at 5110 may include identifying information indicating one or more aspects of the operating environment from a remote vehicle or other remote device external to the autonomous vehicle. For example, the autonomous vehicle may receive, from a remote vehicle, via a wireless electronic communication link, a remote vehicle message including remote vehicle information indicating remote vehicle geospatial state information for the remote vehicle, remote vehicle kinematic state information for the remote vehicle, or both.
[0195] In some embodiments, the executor 5100 may include one or more scenario-specific monitor module instances. For example, the executor 5100 can include a scenario-specific monitor module instance for monitoring pedestrians, a scenario-specific monitor module instance for monitoring intersections, a scenario-specific monitor module instance for monitoring lane changes, or a combination thereof. Each scenario-specific monitor module instance can receive, or otherwise access, operating environment information corresponding to the respective scenario, and can send, store, or otherwise output, for access by the executor 5100, the blocking monitor 5200, the scenario-specific operational control assessment module instance 5300, or a combination thereof, specialized monitoring information corresponding to the respective scenario.
[0196] In some embodiments, the executor 5100 may send the operating environment information representing an operating environment for the autonomous vehicle to the blocking monitor 5200 at 5112. Alternatively, or in addition, the blocking monitor 5200 can receive the operating environment information representing an operating environment for the autonomous vehicle from another component of the autonomous vehicle, such as an autonomous vehicle sensor, the blocking monitor 5200 can read the operating environment information representing an operating environment for the autonomous vehicle from a memory of the autonomous vehicle, or a combination thereof.
[0197] The executor 5100 can detect or identify one or more distinct vehicle operating scenarios at 5120. In some embodiments, the executor 5100 can detect distinct vehicle operating scenarios at 5120 based on one or more aspects of the operating environment represented by the operating environment information identified at 5110.
[0198] In some embodiments, the executor 5100 can identify multiple distinct vehicle operating scenarios, which may be aspects of a composite vehicle operating scenario, at 5120. For example, the operating environment information may include information representing a pedestrian approaching an intersection along an expected path for the autonomous vehicle, and the executor 5100 may identify a pedestrian vehicle operating scenario, an intersection vehicle operating scenario, or both at 5120. In another example, the operating environment represented by the operating environment information may include multiple external objects, and the executor 5100 may identify a distinct vehicle operating scenario corresponding to each external object at 5120.
[0199] The executor 5100 can instantiate a scenario-specific operational control assessment module instance 5300, at 5130, based on one or more aspects of the operating environment represented by the operating environment information. For example, the executor 5100 can instantiate the scenario-specific operational control assessment module instance 5300 at 5130 in response to identifying a distinct vehicle operating scenario at 5120.
[0200] Although one scenario-specific operational control assessment module instance 5300 is shown in figure 5, the executor 5100 can instantiate multiple scenario-specific operational control assessment module instances 5300 based on one or more aspects of the operating environment represented by the operating environment information identified at 5110, each scenario-specific operational control assessment module instance 5300 corresponding to a respective distinct vehicle operating scenario detected at 5120, or to a combination of a distinct external object identified at 5110 and a respective distinct vehicle operating scenario detected at 5120.
[0201] For example, the operating environment represented by the operating environment information identified at 5110 can include multiple external objects, the executor 5100 can detect multiple distinct vehicle operating scenarios, which can be aspects of a composite vehicle operating scenario, at 5120 based on the operating environment represented by the operating environment information identified at 5110, and the executor 5100 can instantiate a scenario-specific operational control assessment module instance 5300 corresponding to each distinct combination of a distinct vehicle operating scenario and an external object.
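For illustration only, instantiating one module instance per distinct combination of operating scenario and external object, as described above, can be sketched with a registry keyed by that pair; the names and data structure are hypothetical, not part of the disclosure:

```python
# Registry of module instances keyed by (scenario type, external object id),
# so each distinct combination gets exactly one instance.
instances = {}

def instantiate(scenario_type, object_id):
    key = (scenario_type, object_id)
    if key not in instances:
        # Placeholder for constructing a scenario-specific assessment instance.
        instances[key] = {"scenario": scenario_type, "object": object_id}
    return instances[key]

a = instantiate("pedestrian", "ped-1")
b = instantiate("pedestrian", "ped-2")   # same scenario, different object
c = instantiate("intersection", "rv-1")  # different scenario
d = instantiate("pedestrian", "ped-1")   # existing combination is reused
```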
[0202] In some embodiments, a scenario-specific operational control assessment module corresponding to the distinct vehicle operating scenario identified at 5120 may be unavailable, and instantiating a scenario-specific operational control assessment module instance 5300 at 5130 may include generating, solving, and instantiating an instance 5300 of a scenario-specific operational control assessment module corresponding to the distinct vehicle operating scenario identified at 5120. For example, the distinct vehicle operating scenario identified at 5120 may indicate an intersection including two road lanes having stop traffic control signals, such as stop signs, and two road lanes having yield traffic control signals, such as yield signs; the available scenario-specific operational control assessment modules may include a Partially Observable Markov Decision Process scenario-specific operational control assessment module that differs from the distinct vehicle operating scenario identified at 5120, such as a Partially Observable Markov Decision Process scenario-specific operational control assessment module that models an intersection scenario including four road lanes having stop traffic control signals; and the executor 5100 can generate, solve, and instantiate, at 5130, an instance 5300 of a Markov Decision Process scenario-specific operational control assessment module modeling an intersection including two road lanes having stop traffic control signals and two road lanes having yield traffic control signals.
[0203] In some embodiments, instantiating a scenario-specific operational control assessment module instance at 5130 may include identifying a spatio-temporal convergence probability based on information about the autonomous vehicle, on operating environment information, or on a combination thereof. Identifying a spatio-temporal convergence probability may include identifying an expected path for the autonomous vehicle, identifying an expected path for a remote vehicle, and identifying a convergence probability for the autonomous vehicle and the remote vehicle indicating a probability that the autonomous vehicle and the remote vehicle may converge or collide, based on the expected path information. The scenario-specific operational control assessment module instance can be instantiated in response to determining that the convergence probability exceeds a defined threshold, such as a defined maximum acceptable convergence probability.
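For illustration only, a crude spatio-temporal convergence check over two expected paths, and the instantiation gate it feeds, might look like the following; the distance threshold, sampling scheme, and probability estimate are illustrative assumptions, not the disclosed method:

```python
import math

def convergence_probability(path_av, path_rv, distance_threshold=2.0):
    # Crude estimate: fraction of time-aligned expected-path points at which
    # the two vehicles are within distance_threshold metres of each other.
    close = sum(1 for (ax, ay), (bx, by) in zip(path_av, path_rv)
                if math.hypot(ax - bx, ay - by) < distance_threshold)
    return close / max(len(path_av), 1)

# Expected paths sampled at the same time steps (illustrative coordinates).
av_path = [(0, 0), (5, 0), (10, 0), (15, 0)]
rv_path = [(10, 8), (10, 4), (10, 1), (10, -3)]

p = convergence_probability(av_path, rv_path)
# Instantiate the module instance only when the convergence probability
# exceeds a defined maximum acceptable threshold.
should_instantiate = p > 0.2
```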
[0204] In some embodiments, instantiating scenario-specific operational control assessment module instances 5300 at 5130 may include sending the operating environment information representing an operating environment for the autonomous vehicle to the scenario-specific operational control assessment module instances 5300, as indicated at 5132.
[0205] The scenario-specific operational control assessment module instance 5300 can receive operating environment information representing an operating environment for the autonomous vehicle, or one or more aspects thereof, at 5310. For example, the scenario-specific operational control assessment module instance 5300 may receive the operating environment information representing an operating environment for the autonomous vehicle, or one or more aspects thereof, sent by the executor 5100 at 5132. Alternatively, or in addition, the scenario-specific operational control assessment module instances 5300 can receive the operating environment information representing an operating environment for the autonomous vehicle from another component of the autonomous vehicle, such as an autonomous vehicle sensor or the blocking monitor 5200, the scenario-specific operational control assessment module instances 5300 can read the operating environment information representing an operating environment for the autonomous vehicle from a memory of the autonomous vehicle, or a combination thereof.
[0206] The blocking monitor 5200 can receive operating environment information representing an operating environment, or an aspect thereof, for the autonomous vehicle at 5210. For example, the blocking monitor 5200 can receive the operating environment information, or an aspect thereof, sent by the executor 5100 at 5112. In some embodiments, the blocking monitor 5200 can receive the operating environment information, or an aspect thereof, from an autonomous vehicle sensor, from an external device, such as a remote vehicle or an infrastructure device, or from a combination thereof. In some embodiments, the blocking monitor 5200 can read the operating environment information, or an aspect thereof, from a memory, such as a memory of the autonomous vehicle.
[0207] The blocking monitor 5200 can determine a respective probability of availability (POA), or corresponding blocking probability, at 5220, for one or more portions of the vehicle transport network, such as portions of the vehicle transport network proximate to the autonomous vehicle, which may include portions of the vehicle transport network corresponding to an expected path of the autonomous vehicle, such as an expected path identified based on a current route of the autonomous vehicle.
[0208] In some embodiments, determining the respective probability of availability at 5220 can include identifying external objects, tracking external objects, projecting location information for external objects, projecting path information for external objects, or a combination thereof. For example, the blocking monitor 5200 can identify an external object and can identify an expected path for the external object, which can indicate a sequence of expected spatial locations, expected temporal locations, and corresponding probabilities.
[0209] In some embodiments, the blocking monitor 5200 can identify the expected path for an external object based on operating environment information, such as information indicating a current location of the external object, information indicating a current trajectory for the external object, information indicating a classification type of the external object, such as information classifying the external object as a pedestrian or a remote vehicle, vehicle transport network information, such as information indicating that the vehicle transport network includes a crosswalk proximate to the external object, previously identified or tracked information associated with the external object, or any combination thereof. For example, the external object can be identified as a remote vehicle, and the expected path for the remote vehicle can be identified based on information indicating a current location of the remote vehicle, information indicating a current trajectory of the remote vehicle, information indicating a current speed of the remote vehicle, vehicle transport network information corresponding to the remote vehicle, legal or regulatory information, or a combination thereof.
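For illustration only, projecting a sequence of expected spatial and temporal locations from the current kinematic state of an external object can be sketched with a constant-velocity, straight-line model; the model, parameter names, and sampling interval are illustrative assumptions:

```python
import math

def expected_path(x, y, heading_deg, speed, horizon=4, dt=1.0):
    # Project a sequence of (x, y, t) expected spatial and temporal locations
    # assuming the object holds its current heading and speed.
    rad = math.radians(heading_deg)
    return [(x + speed * dt * k * math.cos(rad),
             y + speed * dt * k * math.sin(rad),
             dt * k)
            for k in range(1, horizon + 1)]

# A remote vehicle heading due north (90 degrees) at 10 m/s.
path = expected_path(x=0.0, y=0.0, heading_deg=90.0, speed=10.0, horizon=3)
# y advances by roughly 10 m per second while x stays near zero
```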
[0210] In some embodiments, the blocking monitor 5200 can send the availability probabilities identified at 5220 to the scenario-specific operational control assessment module instances 5300 at 5222. Alternatively, or in addition, the blocking monitor 5200 can store the availability probabilities identified at 5220 in a memory of the autonomous vehicle, or a combination thereof. Although not expressly shown in figure 5, the blocking monitor 5200 can send the availability probabilities identified at 5220 to the executor 5100 at 5212 in addition to, or as an alternative to, sending the availability probabilities to the scenario-specific operational control assessment module instances 5300.
[0211] The scenario-specific operational control assessment module instance 5300 can receive the availability probabilities at 5320. For example, the scenario-specific operational control assessment module instance 5300 can receive the availability probabilities sent by the blocking monitor 5200 at 5222. In some embodiments, the scenario-specific operational control assessment module instance 5300 can read the availability probabilities from a memory, such as a memory of the autonomous vehicle.
[0212] The scenario-specific operational control assessment module instance 5300 can solve a model of the corresponding distinct vehicle operating scenario at 5330. In some embodiments, the scenario-specific operational control assessment module instance 5300 can generate or identify a candidate vehicle control action at 5330.
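Solving a full Partially Observable Markov Decision Process model is beyond the scope of a short example, but the mapping from a solved scenario model to a candidate vehicle control action can be illustrated with a one-step expected-utility selection over a belief state; every state, action, and utility value below is invented for illustration and is not part of the disclosure:

```python
# Belief over the hidden state of a pedestrian scenario.
belief = {"pedestrian_crossing": 0.7, "pedestrian_waiting": 0.3}

# Utility of each candidate vehicle control action in each state.
utility = {
    "stop":    {"pedestrian_crossing": 10.0, "pedestrian_waiting": -1.0},
    "advance": {"pedestrian_crossing": -100.0, "pedestrian_waiting": 5.0},
}

def candidate_action(belief, utility):
    # Return the action maximizing expected utility under the current belief.
    def expected(action):
        return sum(belief[s] * utility[action][s] for s in belief)
    return max(utility, key=expected)

action = candidate_action(belief, utility)
# With a 0.7 belief that the pedestrian is crossing, "stop" dominates
```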
[0213] In some embodiments, the scenario-specific operational control assessment module instance 5300 may send the candidate vehicle control action identified at 5330 to the executor 5100 at 5332. Alternatively, or in addition, the scenario-specific operational control assessment module instance 5300 can store the candidate vehicle control action identified at 5330 in a memory of the autonomous vehicle.
[0214] The executor 5100 can receive a candidate vehicle control action at 5140. For example, the executor 5100 can receive the candidate vehicle control action from the scenario-specific operational control assessment module instance 5300 at 5140. Alternatively, or in addition, the executor 5100 can read the candidate vehicle control action from a memory of the autonomous vehicle.
[0215] The executor 5100 may approve the candidate vehicle control action, or otherwise identify the candidate vehicle control action as a vehicle control action for controlling the autonomous vehicle to traverse the vehicle transport network, at 5150. For example, the executor 5100 can identify a distinct vehicle operating scenario at 5120, instantiate a scenario-specific operational control assessment module instance 5300 at 5130, receive a candidate vehicle control action at 5140, and approve the candidate vehicle control action at 5150.
[0216] In some embodiments, the executor 5100 can identify multiple distinct vehicle operating scenarios at 5120, instantiate multiple scenario-specific operational control assessment module instances 5300 at 5130, receive multiple candidate vehicle control actions at 5140, and approve one or more of the candidate vehicle control actions at 5150. In addition, or alternatively, the autonomous vehicle operational management 5000 may include operating one or more previously instantiated scenario-specific operational control assessment module instances (not expressly shown), and the executor can receive candidate vehicle control actions at 5140 from the scenario-specific operational control assessment module instance instantiated at 5130 and from one or more of the previously instantiated scenario-specific operational control assessment module instances, and can approve one or more of the candidate vehicle control actions at 5150.
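The approval step, in which the executor selects among candidate vehicle control actions received from multiple instantiated instances, can be sketched as follows; the conservative priority ordering is an illustrative policy choice, not one prescribed by the disclosure:

```python
# Lower rank means more cautious; prefer the most cautious candidate when the
# instantiated instances disagree (this ordering is an assumption).
PRIORITY = {"stop": 0, "edge": 1, "advance": 2, "proceed": 3}

def approve(candidates):
    # Approve one vehicle control action from the received candidates.
    return min(candidates, key=lambda action: PRIORITY[action])

approved = approve(["proceed", "stop", "advance"])
# the most cautious candidate, "stop", is approved
```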
[0217] Approving a candidate vehicle control action at 5150 may include determining whether to traverse a portion of the vehicle transport network in accordance with the candidate vehicle control action.
[0218] The executor 5100 can control the autonomous vehicle to traverse the vehicle transport network, or a portion thereof, at 5160 in accordance with the vehicle control action identified at 5150.
[0219] The executor 5100 can identify an operating environment, or an aspect thereof, of the autonomous vehicle at 5170. Identifying an operating environment, or an aspect thereof, of the autonomous vehicle at 5170 can be similar to identifying the operating environment of the autonomous vehicle at 5110 and may include updating previously identified operating environment information.
[0220] The executor 5100 can determine or detect whether a distinct vehicle operating scenario is resolved or unresolved at 5180. For example, the executor 5100 can receive operating environment information continuously or on a periodic basis, as described above, and can evaluate the operating environment information to determine whether the distinct vehicle operating scenario is resolved.
[0221] In some embodiments, the executor 5100 may determine that the distinct vehicle operating scenario corresponding to the scenario-specific operational control assessment module instance 5300 is unresolved at 5180, the executor 5100 may send the operating environment information identified at 5170 to the scenario-specific operational control assessment module instances 5300, as indicated at 5185, and uninstantiating the scenario-specific operational control assessment module instances 5300 at 5180 can be omitted or deferred.
[0222] In some embodiments, the executor 5100 may determine that the distinct vehicle operating scenario is resolved at 5180 and may uninstantiate, at 5190, the scenario-specific operational control assessment module instances 5300 corresponding to the distinct vehicle operating scenario determined to be resolved at 5180. For example, the executor 5100 can identify a distinct set of operating conditions forming the distinct vehicle operating scenario for the autonomous vehicle at 5120, can determine at 5180 that one or more of the operating conditions has expired, or has a probability of affecting the operation of the autonomous vehicle below a defined threshold, and can uninstantiate the corresponding scenario-specific operational control assessment module instance 5300.
[0223] Although not expressly shown in figure 5, the executor 5100 may continuously or periodically repeat identifying or updating the operating environment information at 5170 and determining whether the distinct vehicle operating scenario is resolved at 5180 and, in response to determining that the distinct vehicle operating scenario is unresolved at 5180, may send the operating environment information identified at 5170 to the scenario-specific operational control assessment module instances 5300, as indicated at 5185, until determining whether the distinct vehicle operating scenario is resolved at 5180 includes determining that the distinct vehicle operating scenario is resolved.
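For illustration only, the repeat-until-resolved flow described above, in which unresolved scenarios keep receiving fresh operating environment information while resolved scenarios are uninstantiated, can be sketched as follows; the Instance class and its methods are hypothetical stand-ins:

```python
class Instance:
    # Hypothetical stand-in for a scenario-specific assessment module instance.
    def __init__(self, expires_at):
        self.expires_at = expires_at
        self.updates = 0
    def is_resolved(self, env):
        return env["time"] >= self.expires_at
    def receive(self, env):
        self.updates += 1

def update_instances(instances, env):
    # Uninstantiate resolved scenarios; forward fresh operating environment
    # information to the instances whose scenarios remain unresolved.
    for key in [k for k, inst in instances.items() if inst.is_resolved(env)]:
        del instances[key]
    for inst in instances.values():
        inst.receive(env)
    return instances

instances = {"ped-1": Instance(expires_at=5.0), "int-1": Instance(expires_at=20.0)}
update_instances(instances, {"time": 10.0})
# "ped-1" is resolved and removed; "int-1" remains and receives the update
```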
[0224] Figure 6 is a diagram of an example of a blocking scene 6000 according to embodiments of this disclosure. Autonomous vehicle operational management, such as the autonomous vehicle operational management 5000 shown in figure 5, may include an autonomous vehicle 6100, such as the vehicle 1000 shown in figure 1, one of the vehicles 2100/2110 shown in figure 2, a semi-autonomous vehicle, or any other vehicle implementing autonomous driving, operating an autonomous vehicle operational management system, such as the autonomous vehicle operational management system 4000 shown in figure 4, including a blocking monitor, such as the blocking monitor 4200 shown in figure 4 or the blocking monitor 5200 shown in figure 5, to determine a probability of availability, or a corresponding blocking probability, for a portion or area of a vehicle transport network corresponding to the blocking scene 6000. The blocking monitor can operate, and the availability probabilities can be determined, in combination with, or independently of, defined autonomous vehicle operational control scenarios.
[0225] The portion of the vehicle transport network corresponding to the blocking scene 6000 shown in figure 6 includes the autonomous vehicle 6100 traveling on a first road 6200, approaching an intersection 6210 with a second road 6220. The intersection 6210 includes a crosswalk 6300. A pedestrian 6400 is approaching the crosswalk 6300. A remote vehicle 6500 is traveling on the second road 6220, approaching the intersection 6210. An expected path 6110 for the autonomous vehicle 6100 indicates that the autonomous vehicle 6100 can traverse the intersection 6210 by turning right from the first road 6200 onto the second road 6220. An alternative expected path 6120 for the autonomous vehicle 6100, shown using a dashed line, indicates that the autonomous vehicle 6100 can traverse the intersection 6210 by turning left from the first road 6200 onto the second road 6220.
[0226] The blocking monitor can identify an expected path 6410 for the pedestrian 6400. For example, sensor information can indicate that the pedestrian 6400 has a speed exceeding a threshold and a trajectory intersecting the crosswalk 6300, vehicle transport network information can indicate that the intersection includes regulatory controls such that traversing the intersection in accordance with the regulatory controls includes vehicles yielding to pedestrians in the crosswalk, or the intersection 6210 may include one or more traffic control devices (not shown) indicating a permitted right-of-way signal for the pedestrian 6400, and the expected path 6410 for the pedestrian 6400 can be identified as including the pedestrian 6400 crossing the crosswalk 6300 with a high probability, such as 1.0 or 100%.
[0227] The blocking monitor can identify expected paths 6510, 6520 for the remote vehicle 6500. For example, sensor information can indicate that the remote vehicle 6500 is approaching the intersection 6210, vehicle transport network information can indicate that the remote vehicle 6500 can proceed straight through the intersection 6210 or can turn right at the intersection 6210 onto the first road 6200, and the blocking monitor can identify a first expected path 6510 proceeding straight through the intersection and a second expected path 6520 turning right at the intersection for the remote vehicle 6500.
[0228] In some embodiments, the blocking monitor can identify a probability for each of the expected paths 6510, 6520 based, for example, on operating information for the remote vehicle 6500. For example, the operating information for the remote vehicle 6500 can indicate a speed for the remote vehicle that exceeds a maximum turning threshold, and the first expected path 6510 can be identified with a high probability, such as 0.9 or 90%, and the second expected path 6520 can be identified with a low probability, such as 0.1 or 10%.
[0229] In another example, the operating information for the remote vehicle 6500 can indicate a speed for the remote vehicle that is within the maximum turning threshold, and the first expected path 6510 can be identified with a low probability, such as 0.1 or 10%, and the second expected path 6520 can be identified with a high probability, such as 0.9 or 90%.
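The two examples above amount to a simple threshold rule, which can be sketched as follows; the threshold value is a hypothetical placeholder, while the probability assignments mirror the illustrative figures in the text:

```python
def path_probabilities(speed, max_turning_speed=8.0):
    # Assign probabilities to the straight-through and right-turn expected
    # paths from the remote vehicle's approach speed (threshold illustrative).
    if speed > max_turning_speed:
        return {"straight": 0.9, "turn_right": 0.1}
    return {"straight": 0.1, "turn_right": 0.9}

fast = path_probabilities(speed=15.0)  # exceeds the maximum turning threshold
slow = path_probabilities(speed=5.0)   # within the maximum turning threshold
```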
[0230] The blocking monitor can identify a probability of availability for the portion or area of the second road 6220 proximate to, such as within a few feet of, for example three feet (91.44 centimeters) of, the expected path 6410 of the pedestrian, which can correspond with the crosswalk 6300, as very low, such as 0%, indicating that the corresponding portion of the second road 6220 is blocked for a time period corresponding to the pedestrian 6400 crossing the crosswalk 6300.
[0231] The blocking monitor can determine that the first expected path 6510 for the remote vehicle 6500 and the expected path 6110 of the autonomous vehicle 6100 are blocked concurrent with the time period corresponding to the pedestrian 6400 crossing the crosswalk 6300.
[0232] Figure 7 is a diagram of an example of a pedestrian scene 7000 including pedestrian scenarios according to embodiments of this disclosure. Autonomous vehicle operational management, such as the autonomous vehicle operational management 5000 shown in figure 5, can include an autonomous vehicle 7100, such as the vehicle 1000 shown in figure 1, one of the vehicles 2100/2110 shown in figure 2, a semi-autonomous vehicle, or any other vehicle implementing autonomous driving, operating an autonomous vehicle operational management system, such as the autonomous vehicle operational management system 4000 shown in figure 4, including a pedestrian scenario-specific operational control assessment module instance, which can be an instance of a pedestrian scenario-specific operational control assessment module, such as the pedestrian scenario-specific operational control assessment module shown in figure 4, which can be a model of an autonomous vehicle operational control scenario that includes the autonomous vehicle 7100 traversing a portion of the vehicle transport network proximate to a pedestrian. For simplicity and clarity, the portion of the vehicle transport network corresponding to the pedestrian scene 7000 shown in figure 7 is oriented with north at the top and east at the right.
[0233] The portion of the vehicle transport network corresponding to the pedestrian scene 7000 shown in figure 7 includes the autonomous vehicle 7100 traveling northward along a road segment in a lane of a first road 7200, approaching an intersection 7210 with a second road 7220. The intersection 7210 includes a first crosswalk 7300 across the first road 7200 and a second crosswalk 7310 across the second road 7220. A first pedestrian 7400 is in the first road 7200, moving eastward in a non-crosswalk area (jaywalking). A second pedestrian 7410 is proximate to the first crosswalk 7300 and is moving westward. A third pedestrian 7420 is approaching the first crosswalk 7300 from the west. A fourth pedestrian 7430 is approaching the second crosswalk 7310 from the north.
[0234] The autonomous vehicle operational management system may include an autonomous vehicle operational management controller, such as the autonomous vehicle operational management controller 4100 shown in figure 4 or the executor 5100 shown in figure 5, and a blocking monitor, such as the blocking monitor 4200 shown in figure 4 or the blocking monitor 5200 shown in figure 5. The autonomous vehicle 7100 may include one or more sensors, one or more operating environment monitors, or a combination thereof.
[0235] In some embodiments, the autonomous vehicle operational management system can operate continuously or periodically, such as at each temporal location in a sequence of temporal locations. For simplicity and clarity, the geospatial locations of the autonomous vehicle 7100, the first pedestrian 7400, the second pedestrian 7410, the third pedestrian 7420, and the fourth pedestrian 7430 are shown according to a first, sequentially earliest, temporal location of the sequence of temporal locations. Although described with reference to a sequence of temporal locations for simplicity and clarity, each unit of the
autonomous vehicle operational management system can operate at any frequency, the operation of the respective units can be synchronized or unsynchronized, and operations can be performed concurrently with one or more parts of one or more temporal locations. For simplicity and clarity, respective descriptions of one or more temporal locations, such as temporal locations between the temporal locations described in this document, may be omitted from this disclosure.
[0236] At one or more temporal locations, such as at each temporal location, the sensors of the autonomous vehicle 7100 can detect information corresponding to the operating environment of the autonomous vehicle 7100, such as information corresponding to one or more of the pedestrians 7400, 7410, 7420, 7430.
[0237] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management system can identify an expected path 7500 for the autonomous vehicle 7100, a route 7510 for the autonomous vehicle 7100, or both. According to the first temporal location, the expected path 7500 for the autonomous vehicle 7100 indicates that the autonomous vehicle 7100 may cross the intersection 7210 by proceeding north along the first road 7200. The route 7510 for the autonomous vehicle 7100 indicates that the autonomous vehicle 7100 may turn right onto the second road 7220.
[0238] At one or more temporal locations, such as at each temporal location, the operating environment monitors of the autonomous vehicle 7100 can identify or generate operating environment information representing an operating environment, or an aspect thereof, of the autonomous vehicle 7100, such as in response to receiving sensor information corresponding to the pedestrians 7400, 7410, 7420, 7430, which may include associating the sensor information with the pedestrians 7400, 7410, 7420, 7430, and may output the operating environment information, which may include information representing the pedestrians 7400, 7410, 7420, 7430, to the autonomous vehicle operational management controller.
[0239] At one or more temporal locations, such as at each temporal location, the blocking monitor can generate availability probability information indicating respective availability probabilities for one or more areas or parts of the vehicle transport network. For example, according to the first temporal location, the blocking monitor can determine an expected path 7520 for the first pedestrian 7400 and an availability probability for an area or part of the vehicle transport network near a point of convergence between the expected path 7520 for the first pedestrian 7400 and the expected path 7500, or the route 7510, for the autonomous vehicle 7100.
[0240] In another example, the blocking monitor can determine an expected path 7530 for the second pedestrian 7410, an expected path 7540 for the third pedestrian 7420, and an availability probability for an area or part of the vehicle transport network near the first crosswalk 7300. Identifying the availability probability for the area or part of the vehicle transport network near the first crosswalk 7300 may include identifying the second pedestrian 7410 and the third pedestrian 7420 as potentially blocking objects and determining that the corresponding expected paths 7530, 7540 may overlap spatially and temporally.
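An availability probability of the kind described above can be sketched in simplified form. The function below is a hypothetical illustration, not the patent's method: it assumes expected paths are discretized into one position per temporal location, that each path carries a probability, and that blocking events from different paths are independent.

```python
from math import hypot

def availability_probability(area_center, area_radius, expected_paths):
    """Availability probability of an area at each temporal location.

    expected_paths: list of (path, path_probability) pairs, where each path
    is a sequence of (x, y) positions, one per temporal location.  The area
    is treated as blocked at a temporal location when an external object's
    expected position lies within area_radius of area_center (a simplifying
    assumption for this sketch).
    """
    horizon = max(len(path) for path, _ in expected_paths)
    availability = []
    for t in range(horizon):
        p_blocked = 0.0
        for path, p_path in expected_paths:
            if t < len(path):
                x, y = path[t]
                if hypot(x - area_center[0], y - area_center[1]) <= area_radius:
                    # Combine independent blocking events.
                    p_blocked = p_blocked + p_path - p_blocked * p_path
        availability.append(1.0 - p_blocked)
    return availability
```

For instance, two pedestrian paths that reach the same area at different temporal locations each lower the availability only at the steps where their expected positions overlap the area.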
[0241] In another example, the blocking monitor can determine multiple expected paths for one or more external objects. For example, the blocking monitor can identify a first expected path 7530 for the second pedestrian 7410 with a high probability and can identify a second expected path 7532 for the second pedestrian 7410 with a low probability.
[0242] In another example, the blocking monitor can determine an expected path 7550 for the fourth pedestrian 7430 and an availability probability for an area or part of the vehicle transport network near the second crosswalk 7310.
[0243] In some embodiments, generating the availability probability information may include generating availability probabilities for a respective area or part of the vehicle transport network corresponding to multiple temporal locations in the sequence of temporal locations. The blocking monitor can output the availability probability information to the autonomous vehicle operational management controller, or for access by the autonomous vehicle operational management controller.
[0244] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller can generate operating environment information, or update previously generated operating environment information, which may include receiving the operating environment information or a part thereof.
[0245] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller can detect or identify one or more distinct vehicle operating scenarios, such as based on the operating environment represented by the operating environment information, which may include the operating environment information produced by the operating environment monitors, the availability probability information produced by the blocking monitor, or a combination thereof. For example, according to the first temporal location, the autonomous vehicle operational management controller can detect or identify one or more of a first pedestrian scenario including the first pedestrian 7400, a second pedestrian scenario including the second pedestrian 7410, a third pedestrian scenario including the third pedestrian 7420, and a fourth pedestrian scenario including the fourth pedestrian 7430.
[0246] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller can detect one or more previously undetected vehicle operating scenarios. For example, according to the first temporal location the autonomous vehicle operational management controller can detect the first vehicle operating scenario, and according to a second temporal location of the sequence of temporal locations, such as a temporal location subsequent to the first temporal location, the autonomous vehicle operational management controller can detect the second vehicle operating scenario.
[0247] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller can instantiate one or more pedestrian scenario-specific operational control assessment module instances in response to detecting or identifying one or more of the first pedestrian scenario including the first pedestrian 7400, the second pedestrian scenario including the second pedestrian 7410, the third pedestrian scenario including the third pedestrian 7420, or the fourth pedestrian scenario including the fourth pedestrian 7430.
[0248] For example, according to the first temporal location, the autonomous vehicle operational management controller can detect the first pedestrian scenario including the first pedestrian 7400, can determine that a pedestrian scenario-specific operational control assessment module corresponding to the first pedestrian scenario is available, and can instantiate a first instance of the pedestrian scenario-specific operational control assessment module in response to detecting the first pedestrian scenario including the first pedestrian 7400.
[0249] In another example, the autonomous vehicle operational management controller can detect the first pedestrian scenario including the first pedestrian 7400, determine that a pedestrian scenario-specific operational control assessment module corresponding to the first pedestrian scenario is unavailable, generate and solve a pedestrian scenario-specific operational control assessment module corresponding to the first pedestrian scenario, and instantiate an instance of the pedestrian scenario-specific operational control assessment module corresponding to the first pedestrian scenario, in response to detecting the first pedestrian scenario including the first pedestrian 7400.
[0250] In some embodiments, the autonomous vehicle operational management controller can detect or identify one or more of the pedestrian scenarios substantially concurrently. For example, the autonomous vehicle operational management controller can detect or identify the second pedestrian scenario including the second pedestrian 7410 and the third pedestrian scenario including the third pedestrian 7420 substantially concurrently.
[0251] In some embodiments, the autonomous vehicle operational management controller can instantiate two or more respective instances of respective pedestrian scenario-specific operational control assessment modules substantially concurrently. For example, the autonomous vehicle operational management controller can detect or identify the second pedestrian scenario including the second pedestrian 7410 and the third pedestrian scenario including the third pedestrian 7420 substantially concurrently, and can instantiate an instance of the pedestrian scenario-specific operational control assessment module corresponding to the second pedestrian scenario substantially concurrently with instantiating an instance of the pedestrian scenario-specific operational control assessment module corresponding to the third pedestrian scenario.
[0252] In another example, the autonomous vehicle operational management controller can detect or identify, substantially concurrently, the second pedestrian scenario including the first expected path 7530 for the second pedestrian 7410 and a fifth pedestrian scenario including the second expected path 7532 for the second pedestrian 7410, and can instantiate an instance of a pedestrian scenario-specific operational control assessment module corresponding to the second pedestrian scenario substantially concurrently with instantiating an instance of a pedestrian scenario-specific operational control assessment module corresponding to the fifth pedestrian scenario.
[0253] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller may send, or otherwise make available, operating environment information, such as new or updated operating environment information, to previously instantiated, or operating, scenario-specific operational control assessment module instances.
[0254] Instantiating, or updating, a scenario-specific operational control assessment module instance may include providing the operating environment information, or a part thereof, such as the sensor information or the availability probabilities, to the respective scenario-specific operational control assessment module instances, such as by sending the operating environment information, or a part thereof, to the respective scenario-specific operational control assessment module instances, or by storing the operating environment information, or a part thereof, for access by the respective scenario-specific operational control assessment module instances.
[0255] At one or more temporal locations, such as at each temporal location, the respective pedestrian scenario-specific operational control assessment module instances may receive, or otherwise access, the operating environment information corresponding to the respective autonomous vehicle operational control scenarios. For example, according to the first temporal location, the first pedestrian scenario-specific operational control assessment module instance may receive operating environment information corresponding to the first pedestrian scenario, which may include the availability probability information for the area or part of the vehicle transport network near the point of convergence between the expected path 7520 for the first pedestrian 7400 and the expected path 7500, or the route 7510, for the autonomous vehicle 7100.
[0256] A pedestrian scenario-specific operational control assessment module can model a pedestrian scenario as including states representing spatiotemporal locations for the autonomous vehicle 7100, spatiotemporal locations for the respective pedestrians 7400, 7410, 7420, 7430, and corresponding blocking probabilities. A pedestrian scenario-specific operational control assessment module can model a pedestrian scenario as including actions such as 'stop' (or 'wait'), 'advance', and 'proceed'. A pedestrian scenario-specific operational control assessment module can model a pedestrian scenario as including state transition probabilities representing probabilities that a respective pedestrian enters an expected path of the autonomous vehicle, such as by traversing an expected path associated with the respective pedestrian. The state transition probabilities can be determined based on the operating environment information. A pedestrian scenario-specific operational control assessment module can model a pedestrian scenario as including negative-value rewards for violating traffic control regulations, and as including a positive-value reward for completing the pedestrian scenario.
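A model with states, actions, transition probabilities, and rewards of this kind can be read as a Markov decision process. The sketch below is a drastically simplified, hypothetical instance: the two-factor state, the reward magnitudes, and the value-iteration solver are invented for illustration, whereas the modeled states in the patent track spatiotemporal locations per object. Here `p_enter` stands in for the state transition probability that the pedestrian enters the expected path of the autonomous vehicle.

```python
ACTIONS = ("stop", "advance", "proceed")

def transitions(state, action, p_enter):
    """Return [(next_state, probability)] for a toy state of the form
    (occupancy, phase) with occupancy in {'clear', 'blocked'} and phase in
    {'active', 'complete'}."""
    occupancy, phase = state
    if phase == "complete":
        return [(state, 1.0)]
    if action in ("advance", "proceed") and occupancy == "clear":
        return [(("clear", "complete"), 1.0)]
    # Otherwise the pedestrian may enter (or stay in) the expected path.
    return [(("blocked", "active"), p_enter), (("clear", "active"), 1.0 - p_enter)]

def reward(state, action):
    occupancy, phase = state
    if phase == "complete":
        return 0.0
    if action == "proceed" and occupancy == "blocked":
        return -10.0  # negative-value reward: violation
    if action in ("advance", "proceed") and occupancy == "clear":
        return 1.0    # positive-value reward: completes the scenario
    return -0.1       # small cost for waiting

def candidate_action(state, p_enter, gamma=0.9, iters=50):
    """Value iteration over the toy model; returns the candidate vehicle
    control action for the given state."""
    states = [(o, p) for o in ("clear", "blocked") for p in ("active", "complete")]
    value = {s: 0.0 for s in states}
    for _ in range(iters):
        value = {s: max(sum(p * (reward(s, a) + gamma * value[ns])
                            for ns, p in transitions(s, a, p_enter))
                        for a in ACTIONS)
                 for s in states}
    return max(ACTIONS, key=lambda a: sum(p * (reward(state, a) + gamma * value[ns])
                                          for ns, p in transitions(state, a, p_enter)))
```

With these assumed rewards, the solved policy waits while the crosswalk is blocked and advances once it is clear, mirroring the candidate vehicle control actions described in the text.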
[0257] At one or more temporal locations, such as at each temporal location, each instantiated pedestrian scenario-specific operational control assessment module instance can generate a respective candidate vehicle control action, such as 'stop', 'advance', or 'proceed', based on the respective modeled scenario and the corresponding operating environment information, and can output the respective candidate vehicle control action to the autonomous vehicle operational management controller, such as by sending the respective candidate vehicle control action to the autonomous vehicle operational management controller or by storing the respective candidate vehicle control action for access by the autonomous vehicle operational management controller.
[0258] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller can receive candidate vehicle control actions from the respective instantiated pedestrian scenario-specific operational control assessment module instances, can identify a vehicle control action based on the received candidate vehicle control actions for controlling the autonomous vehicle 7100 at the corresponding temporal location, and can control the autonomous vehicle 7100 to traverse the vehicle transport network, or a part thereof, in accordance with the identified vehicle control action.
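The text does not specify how the controller arbitrates among candidate vehicle control actions; one plausible policy, assumed here purely for illustration, is to select the most conservative candidate received at the current temporal location.

```python
# Assumed conservatism ordering: 'stop' is safest, 'proceed' least constrained.
CONSERVATISM = {"stop": 0, "advance": 1, "proceed": 2}

def identify_vehicle_control_action(candidates):
    """Select one vehicle control action from the candidates produced by the
    instantiated module instances (a hypothetical most-conservative rule)."""
    if not candidates:
        return "proceed"  # no active scenario constrains the vehicle
    return min(candidates, key=lambda action: CONSERVATISM[action])
```

Other arbitration policies, such as weighting candidates by scenario severity or learned preferences, would fit the same interface.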
[0259] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller can determine whether one or more of the detected vehicle operating scenarios has expired and, in response to determining that a vehicle operating scenario has expired, can uninstantiate the corresponding pedestrian scenario-specific operational control assessment module instances.
[0260] Figure 8 is a diagram of an example of an intersection scene 8000 including intersection scenarios according to embodiments of this disclosure. Autonomous vehicle operational management, such as the autonomous vehicle operational management 5000 shown in figure 5, can include an autonomous vehicle 8100, such as the vehicle 1000 shown in figure 1, one of the vehicles 2100/2110 shown in figure 2, a semi-autonomous vehicle, or any other vehicle implementing autonomous driving, operating an autonomous vehicle operational management system, such as the autonomous vehicle operational management system 4000 shown in figure 4, including an intersection scenario-specific operational control evaluation module instance, which can be an instance of an intersection scenario-specific operational control evaluation module, such as the intersection scenario-specific operational control evaluation module shown in figure 4, which can be a model of an autonomous vehicle operational control scenario that includes the autonomous vehicle 8100 traversing a part of the vehicle transport network including an intersection. For simplicity and clarity, the part of the vehicle transport network corresponding to the intersection scene 8000 shown in figure 8 is oriented with north at the top and east at the right.
[0261] The part of the vehicle transport network corresponding to the intersection scene 8000 shown in figure 8 includes the autonomous vehicle 8100 traveling from west to east along a first road 8200, approaching an intersection 8210 with a second road 8220. An expected path 8110 for the autonomous vehicle 8100 indicates that the autonomous vehicle 8100 may cross straight through the intersection 8210. A first alternative expected path 8120 for the autonomous vehicle 8100, shown using a dashed line, indicates that the autonomous vehicle 8100 may cross the intersection 8210 by turning right from the first road 8200 onto the second road 8220. A second alternative expected path 8130 for the autonomous vehicle 8100, shown using a dashed line, indicates that the autonomous vehicle 8100 may cross the intersection 8210 by turning left from the first road 8200 onto the second road 8220.
[0262] A first remote vehicle 8300 is shown moving south along a first southbound lane of the second road 8220, approaching the intersection 8210. A second remote vehicle 8310 is shown moving north along a first northbound lane of the second road 8220, approaching the intersection 8210. A third remote vehicle 8320 is shown moving north along a second northbound lane of the second road 8220, approaching the intersection 8210. A fourth remote vehicle 8330 is shown moving north along the first northbound lane of the second road 8220, approaching the intersection 8210.
[0263] The autonomous vehicle operational management system may include an autonomous vehicle operational management controller, such as the autonomous vehicle operational management controller 4100 shown in figure 4 or the performer 5100 shown in figure 5, and a blocking monitor, such as the blocking monitor 4200 shown in figure 4 or the blocking monitor 5200 shown in figure 5. The autonomous vehicle 8100 may include one or more sensors, one or more operating environment monitors, or a combination thereof.
[0264] In some embodiments, the autonomous vehicle operational management system can operate continuously or periodically, such as at each temporal location in a sequence of temporal locations. For simplicity and clarity, the geospatial locations of the autonomous vehicle 8100, the first remote vehicle 8300, the second remote vehicle 8310, the third remote vehicle 8320, and the fourth remote vehicle 8330 are shown according to a first, sequentially earliest, temporal location of the sequence of temporal locations. Although described with reference to a sequence of temporal locations for simplicity and clarity, each unit of the autonomous vehicle operational management system can operate at any frequency, the operation of the respective units can be synchronized or unsynchronized, and operations can be performed concurrently with one or more parts of one or more temporal locations. For simplicity and clarity, respective descriptions of one or more temporal locations, such as temporal locations between the temporal locations described in this document, may be omitted from this disclosure.
[0265] At one or more temporal locations, such as at each temporal location, the sensors of the autonomous vehicle 8100 can detect information corresponding to the operating environment of the autonomous vehicle 8100, such as information corresponding to one or more of the remote vehicles 8300, 8310, 8320, 8330.
[0266] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management system can identify an expected path 8110, 8120, 8130 for the autonomous vehicle 8100, a route (not shown) for the autonomous vehicle 8100, or both.
[0267] At one or more temporal locations, such as at each temporal location, the operating environment monitors of the autonomous vehicle 8100 can identify or generate operating environment information representing an operating environment, or an aspect thereof, of the autonomous vehicle 8100, such as in response to receiving sensor information corresponding to the remote vehicles 8300, 8310, 8320, 8330, which may include associating the sensor information with the remote vehicles 8300, 8310, 8320, 8330, and can output the operating environment information, which may include information representing the remote vehicles 8300, 8310, 8320, 8330, to the autonomous vehicle operational management controller.
[0268] At one or more temporal locations, such as at each temporal location, the blocking monitor can generate availability probability information indicating respective availability probabilities for one or more areas or parts of the vehicle transport network. For example, the blocking monitor can determine one or more probable expected paths 8400, 8402 for the first remote vehicle 8300, one or more probable expected paths 8410, 8412 for the second remote vehicle 8310, one or more probable expected paths 8420, 8422 for the third remote vehicle 8320, and an expected path 8430 for the fourth remote vehicle 8330. The blocking monitor can generate availability probability information indicating respective availability probabilities for one or more areas or parts of the vehicle transport network corresponding to one or more of the expected path 8110 for the autonomous vehicle 8100, the first alternative expected path 8120 for the autonomous vehicle 8100, or the second alternative expected path 8130 for the autonomous vehicle 8100.
[0269] In some embodiments, generating the availability probability information may include generating availability probabilities for a respective area or part of the vehicle transport network corresponding to multiple temporal locations in the sequence of temporal locations. The blocking monitor can output the availability probability information to the autonomous vehicle operational management controller, or for access by the autonomous vehicle operational management controller.
[0270] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller can generate operating environment information, or update previously generated operating environment information, which may include receiving the operating environment information or a part thereof.
[0271] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller can detect or identify one or more distinct vehicle operating scenarios, such as based on the operating environment represented by the operating environment information, which may include the operating environment information produced by the operating environment monitors, the availability probability information produced by the blocking monitor, or a combination thereof. For example, the autonomous vehicle operational management controller can detect or identify one or more of a first intersection scenario including the first remote vehicle 8300, a second intersection scenario including the second remote vehicle 8310, a third intersection scenario including the third remote vehicle 8320, and a fourth intersection scenario including the fourth remote vehicle 8330.
[0272] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller can detect one or more previously undetected vehicle operating scenarios. For example, according to a first temporal location the autonomous vehicle operational management controller can detect the first intersection scenario, and according to a second temporal location of the sequence of temporal locations, such as a temporal location subsequent to the first temporal location, the autonomous vehicle operational management controller can detect the second intersection scenario.
[0273] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller can instantiate one or more intersection scenario-specific operational control assessment module instances in response to detecting or identifying one or more of the first intersection scenario, the second intersection scenario, the third intersection scenario, or the fourth intersection scenario.
[0274] In some embodiments, the autonomous vehicle operational management controller can detect or identify one or more of the intersection scenarios substantially concurrently. For example, the autonomous vehicle operational management controller can detect or identify the second intersection scenario and the third intersection scenario substantially concurrently.
[0275] In some embodiments, the autonomous vehicle operational management controller can instantiate two or more respective instances of respective intersection scenario-specific operational control assessment modules substantially concurrently. For example, the autonomous vehicle operational management controller can detect or identify the second intersection scenario and the third intersection scenario substantially concurrently, and can instantiate an instance of the intersection scenario-specific operational control assessment module corresponding to the second intersection scenario substantially concurrently with instantiating an instance of the intersection scenario-specific operational control assessment module corresponding to the third intersection scenario.
[0276] In another example, the autonomous vehicle operational management controller can detect or identify, substantially concurrently, the first intersection scenario including the first expected path 8400 for the first remote vehicle 8300 and a fifth intersection scenario including the second expected path 8402 for the first remote vehicle 8300, and can instantiate an instance of an intersection scenario-specific operational control assessment module corresponding to the first intersection scenario substantially concurrently with instantiating an instance of an intersection scenario-specific operational control assessment module corresponding to the fifth intersection scenario.
[0277] At one or more temporal locations, such as at each temporal location, the autonomous vehicle operational management controller can send, or otherwise make available, operating environment information, such as new or updated operating environment information, to previously instantiated, or operating, scenario-specific operational control assessment module instances.
[0278] Instantiating, or updating, a scenario-specific operational control assessment module instance may include providing the operating environment information, or a part thereof, such as the sensor information or the availability probabilities, to the respective scenario-specific operational control assessment module instances, such as by sending the operating environment information, or a part thereof, to the respective scenario-specific operational control assessment module instances, or by storing the operating environment information, or a part thereof, for access by the respective scenario-specific operational control assessment module instances.
[0279] In some embodiments, the operating environment information may indicate operational information for the autonomous vehicle 8100, such as geospatial location information, speed information, acceleration information, pendency information, priority information, or a combination thereof, and operational information for one or more of the remote vehicles 8300, 8310, 8320, 8330, such as geospatial location information, speed information, acceleration information, pendency information, priority information, or a combination thereof. The pendency information can indicate a time period corresponding to the respective vehicle and a respective geospatial location, such as a period of time during which the respective vehicle has been stationary at the intersection. The priority information can indicate a right-of-way priority corresponding to a respective vehicle relative to other vehicles in the intersection scene 8000.
[0280] An intersection scenario-specific operational control assessment module can model an intersection scenario as including states representing spatiotemporal locations for the autonomous vehicle 8100, spatiotemporal locations for the respective remote vehicles 8300, 8310, 8320, 8330, pendency information, priority information, and corresponding blocking probabilities. An intersection scenario-specific operational control assessment module can model an intersection scenario as including actions such as 'stop' (or 'wait'), 'advance', and 'proceed'. An intersection scenario-specific operational control assessment module can model an intersection scenario as including state transition probabilities representing probabilities that a respective remote vehicle enters an expected path of the autonomous vehicle, such as by traveling along an expected path associated with the respective remote vehicle. The state transition probabilities can be determined based on the operating environment information. An intersection scenario-specific operational control assessment module can model an intersection scenario as including negative-value rewards for violating traffic control regulations, and as including a positive-value reward for completing the intersection scenario.
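One way pendency and priority information might feed such a model is a right-of-way check at a multi-way stop: longest pendency first, ties broken by a right-of-way rank. The function name, field names, and the tie-breaking convention below are assumptions made for illustration; the patent does not prescribe this rule.

```python
def has_right_of_way(ego, remote_vehicles):
    """ego and each remote vehicle: dict with 'pendency' (seconds the vehicle
    has been stationary at the intersection) and 'rank' (lower value means
    higher right-of-way priority, e.g. the vehicle on the right).

    Returns True when no remote vehicle has waited longer than ego, and no
    equally pending remote vehicle outranks ego."""
    for rv in remote_vehicles:
        if rv["pendency"] > ego["pendency"]:
            return False
        if rv["pendency"] == ego["pendency"] and rv["rank"] < ego["rank"]:
            return False
    return True
```

In the modeled scenario, such a check could bias the transition probabilities or rewards toward 'proceed' when the autonomous vehicle holds the right of way and toward 'stop' otherwise.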
[0281] At one or more temporal locations, such as at each temporal location, the respective intersection scenario-specific operational control assessment module instances may receive, or otherwise access, the operating environment information corresponding to the respective intersection scenarios. For example, according to the first temporal location, the first intersection scenario-specific operational control assessment module instance may receive the operating environment information corresponding to the first intersection scenario, which may include the availability probability information for the area or part of the vehicle transport network near the point of convergence between the first expected path 8400 for the first remote vehicle 8300 and the expected path 8110 for the autonomous vehicle 8100.
[0282] At one or more temporal locations, such as at each temporal location, each instantiated intersection scenario-specific operational control assessment module instance can generate a respective candidate vehicle control action, such as 'stop', 'advance', or 'proceed', based on the respective modeled scenario and the corresponding operating environment information, and can output the respective candidate vehicle control action to the autonomous vehicle operational management controller, such as by sending the respective candidate vehicle control action to the autonomous vehicle operational management controller or by storing the respective candidate vehicle control action for access by the autonomous vehicle operational management controller.
[0283] In one or more temporal locations, such as in each temporal location, the autonomous vehicle operational management controller may receive candidate vehicle control actions from the respective instantiated scenario-specific operational control assessment module instances, may identify a vehicle control action based on the received candidate vehicle control actions for controlling the autonomous vehicle 8100 at the corresponding temporal location, and may control the autonomous vehicle 8100 to traverse the vehicle transport network, or a portion thereof, in accordance with the identified vehicle control action.
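One way to picture the controller identifying a single vehicle control action from several candidates is to rank the candidates by conservatism and select the most conservative one. This selection rule is an assumption used here only for illustration; the disclosure leaves the selection policy open:

```python
# Assumed conservatism ranking: lower rank = more conservative.
CONSERVATISM = {"stop": 0, "advance": 1, "decelerate": 1, "maintain": 2, "proceed": 3}

def identify_vehicle_control_action(candidates):
    """Return the most conservative candidate vehicle control action, or
    'proceed' when no instantiated scenario constrains the vehicle."""
    if not candidates:
        return "proceed"
    return min(candidates, key=lambda action: CONSERVATISM[action])
```

For example, if one instance proposes 'proceed' while another proposes 'stop', this rule would control the vehicle to stop.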
[0284] In one or more temporal locations, such as in each temporal location, the autonomous vehicle operational management controller may determine whether one or more of the detected intersection scenarios has expired and, in response to determining that an intersection scenario has expired, may uninstantiate the corresponding intersection scenario-specific operational control assessment module instances.
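A sketch of the expiry check might look like the following. Treating a scenario as expired once its remote vehicle leaves the defined distance is an assumed criterion for illustration; the disclosure does not fix the expiry test:

```python
def prune_expired_instances(instances, distances, defined_distance=50.0):
    """Return only the instances whose remote vehicle is still within the
    defined distance of the autonomous vehicle; the rest are treated as
    expired and dropped (i.e. uninstantiated).

    instances: {remote_vehicle_id: module_instance}
    distances: {remote_vehicle_id: current distance in meters} (hypothetical units)
    """
    return {vehicle_id: inst
            for vehicle_id, inst in instances.items()
            if distances.get(vehicle_id, float("inf")) <= defined_distance}
```

A vehicle with no current distance reading is conservatively treated here as out of range and its instance uninstantiated; that choice is also an assumption.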
[0285] Figure 9 is a diagram of an example of a lane change scene 9000 including a lane change scenario in accordance with embodiments of this disclosure. Autonomous vehicle operational management, such as the autonomous vehicle operational management 5000 shown in figure 5, may include an autonomous vehicle 9100, such as the vehicle 1000 shown in figure 1, one of the vehicles 2100, 2110 shown in figure 2, a semi-autonomous vehicle, or any other vehicle implementing autonomous driving, operating an autonomous vehicle operational management system, such as the autonomous vehicle operational management system 4000 shown in figure 4, including a lane change scenario-specific operational control assessment module instance, which may be an instance of a lane change scenario-specific operational control assessment module, such as the lane change scenario-specific operational control assessment module 4430 shown in figure 4, which may be a model of an autonomous vehicle operational control scenario that includes the autonomous vehicle 9100 traversing a portion of the vehicle transport network by performing a lane change. For simplicity and clarity, the portion of the vehicle transport network corresponding to the lane change scene 9000 shown in figure 9 is oriented with north at the top and east at the right.
[0286] The portion of the vehicle transport network corresponding to the lane change scene 9000 shown in figure 9 includes the autonomous vehicle 9100 traveling northbound along a first road 9200. The first road 9200 includes a northbound east lane 9210 and a northbound west lane 9220. A current expected path 9110 for the autonomous vehicle 9100 indicates that the autonomous vehicle 9100 is traveling northbound in the northbound east lane 9210. An alternative expected path 9120 for the autonomous vehicle 9100, shown using a dashed line, indicates that the autonomous vehicle 9100 may traverse the vehicle transport network by performing a lane change from the northbound east lane 9210 to the northbound west lane 9220.
[0287] A first remote vehicle 9300 is shown traveling northbound in the northbound east lane 9210 ahead (north) of the autonomous vehicle 9100. A second remote vehicle 9400 is shown traveling northbound in the northbound west lane 9220 behind (south of) the autonomous vehicle 9100.
[0288] The autonomous vehicle operational management system may include an autonomous vehicle operational management controller, such as the autonomous vehicle operational management controller 4100 shown in figure 4 or the executor 5100 shown in figure 5, and a blocking monitor, such as the blocking monitor 4200 shown in figure 4 or the blocking monitor 5200 shown in figure 5. The autonomous vehicle 9100 may include one or more sensors, one or more operating environment monitors, or a combination thereof.
[0289] In some embodiments, the autonomous vehicle operational management system may operate continuously or periodically, such as at each temporal location in a sequence of temporal locations. For simplicity and clarity, the geospatial location of the autonomous vehicle 9100, the first remote vehicle 9300, and the second remote vehicle 9400 is shown in accordance with a first, sequentially earliest, temporal location of the sequence of temporal locations. Although described with reference to a sequence of temporal locations for simplicity and clarity, each unit of the autonomous vehicle operational management system may operate at any frequency, the operation of the respective units may be synchronized or unsynchronized, and operations may be performed concurrently with one or more portions of one or more temporal locations. For simplicity and clarity, the respective descriptions of one or more temporal locations, such as temporal locations between the temporal locations described in this document, may be omitted from this disclosure.
[0290] In one or more temporal locations, such as in each temporal location, the sensors of the autonomous vehicle 9100 may detect information corresponding to the operating environment of the autonomous vehicle 9100, such as information corresponding to one or more of the remote vehicles 9300, 9400.
[0291] In one or more temporal locations, such as in each temporal location, the autonomous vehicle operational management system may identify an expected path 9110, 9120 for the autonomous vehicle 9100, a route (not shown) for the autonomous vehicle 9100, or both.
[0292] In one or more temporal locations, such as in each temporal location, the operating environment monitors of the autonomous vehicle 9100 may identify or generate operating environment information representing an operating environment, or an aspect thereof, of the autonomous vehicle 9100, such as in response to receiving sensor information corresponding to the remote vehicles 9300, 9400, which may include associating the sensor information with the remote vehicles 9300, 9400, and may output the operating environment information, which may include information representing the remote vehicles 9300, 9400, to the autonomous vehicle operational management controller.
[0293] In one or more temporal locations, such as in each temporal location, the blocking monitor may generate availability probability information indicating respective availability probabilities for one or more areas or portions of the vehicle transport network. For example, the blocking monitor may determine one or more probable expected paths 9310, 9320 for the first remote vehicle 9300, and one or more probable expected paths 9410, 9420 for the second remote vehicle 9400. The first probable expected path 9310 for the first remote vehicle 9300 indicates that the first remote vehicle 9300 crosses the corresponding portion of the vehicle transport network in the northbound east lane 9210. The second probable expected path 9320, shown using a dashed line, for the first remote vehicle 9300 indicates that the first remote vehicle 9300 crosses the corresponding portion of the vehicle transport network by performing a lane change to the northbound west lane 9220. The first probable expected path 9410 for the second remote vehicle 9400 indicates that the second remote vehicle 9400 crosses the corresponding portion of the vehicle transport network in the northbound west lane 9220. The second probable expected path 9420, shown using a dashed line, for the second remote vehicle 9400 indicates that the second remote vehicle 9400 crosses the corresponding portion of the vehicle transport network by performing a lane change to the northbound east lane 9210.
[0294] The blocking monitor may generate availability probability information indicating respective availability probabilities for one or more areas or portions of the vehicle transport network corresponding to one or more of the expected path 9110 for the autonomous vehicle 9100 and the alternative expected path 9120 for the autonomous vehicle 9100.
[0295] In some embodiments, generating the availability probability information may include generating availability probabilities for a respective area or portion of the vehicle transport network corresponding to multiple temporal locations in the sequence of temporal locations. The blocking monitor may output the availability probability information to the autonomous vehicle operational management controller, or store it for access by the autonomous vehicle operational management controller.
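The availability probabilities over a sequence of temporal locations can be sketched as follows. Treating per-vehicle occupancy probabilities as independent, so that an area is available only when no remote vehicle occupies it, is a simplifying assumption made here for illustration:

```python
def availability_probabilities(occupancy_by_time):
    """Compute availability probabilities for one area over a sequence of
    temporal locations.

    occupancy_by_time: one entry per temporal location, each a list of
    per-remote-vehicle probabilities that the vehicle occupies the area.
    Independence across vehicles is an assumed simplification.
    """
    result = []
    for occupancies in occupancy_by_time:
        p_available = 1.0
        for p_occupied in occupancies:
            p_available *= (1.0 - p_occupied)
        result.append(p_available)
    return result
```

For example, an area threatened by two remote vehicles each with 0.5 occupancy probability would be available with probability 0.25 at that temporal location.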
[0296] In one or more temporal locations, such as in each temporal location, the autonomous vehicle operational management controller may generate operating environment information, or update previously generated operating environment information, which may include receiving the operating environment information or a portion thereof.
[0297] In one or more temporal locations, such as in each temporal location, the autonomous vehicle operational management controller may detect or identify one or more distinct vehicle operating scenarios, such as based on the operating environment represented by the operating environment information, which may include the operating environment information output by the operating environment monitors, the availability probability information output by the blocking monitor, or a combination thereof. For example, the autonomous vehicle operational management controller may detect or identify one or more of a first lane change scenario including the first remote vehicle 9300, a second lane change scenario including the second remote vehicle 9400, or both.
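As an illustrative sketch of detecting distinct lane change scenarios from the operating environment information, the rule below emits one scenario per remote vehicle in the same or an adjacent lane within the defined distance. The rule and the field names are assumed for illustration; the disclosure does not specify the detection heuristic:

```python
def detect_lane_change_scenarios(remote_vehicles, ego_lane, defined_distance=50.0):
    """remote_vehicles: list of dicts with hypothetical 'id', 'lane', and
    'distance' fields derived from the operating environment information.
    Returns one distinct lane change scenario per qualifying remote vehicle."""
    scenarios = []
    for rv in remote_vehicles:
        if abs(rv["lane"] - ego_lane) <= 1 and rv["distance"] <= defined_distance:
            scenarios.append({"type": "lane-change", "remote_vehicle": rv["id"]})
    return scenarios
```

Applied to the scene above, a vehicle like 9300 (same lane, ahead) and one like 9400 (adjacent lane, behind) would each yield a distinct scenario, matching the first and second lane change scenarios described.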
[0298] In one or more temporal locations, such as in each temporal location, the autonomous vehicle operational management controller may instantiate one or more lane change scenario-specific operational control assessment module instances in response to detecting or identifying one or more of the first lane change scenario or the second lane change scenario.
[0299] In one or more temporal locations, such as in each temporal location, the autonomous vehicle operational management controller may send, or otherwise make available, operating environment information, such as new or updated operating environment information, to previously instantiated, or operating, scenario-specific operational control assessment module instances.
[0300] Instantiating, or updating, a scenario-specific operational control assessment module instance may include providing the operating environment information, or a portion thereof, such as the sensor information or the availability probabilities, to the respective scenario-specific operational control assessment module instances, such as by sending the operating environment information, or a portion thereof, to the respective instances, or by storing the operating environment information, or a portion thereof, for access by the respective instances.
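The send-or-store update described above can be sketched as a simple broadcast to the operating instances. The class shape and method names are assumptions for illustration:

```python
class SSOCEMInstance:
    """Hypothetical scenario-specific operational control assessment
    module instance that retains the latest operating environment info."""

    def __init__(self, scenario_id):
        self.scenario_id = scenario_id
        self.environment = {}

    def update(self, environment_info):
        # merge new or updated operating environment information
        self.environment.update(environment_info)

def broadcast_environment(instances, environment_info):
    """Send the operating environment information, or a portion thereof,
    to each previously instantiated, or operating, instance."""
    for inst in instances:
        inst.update(environment_info)
    return len(instances)
```

In a real system the stored-for-access alternative would replace the direct `update` call with a shared store the instances read from.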
[0301] In some embodiments, the operating environment information may indicate operational information for the autonomous vehicle 9100, such as geospatial location information, speed information, acceleration information, or a combination thereof, and operational information for one or more of the remote vehicles 9300, 9400, such as geospatial location information, speed information, acceleration information, or a combination thereof.
[0302] A lane change scenario-specific operational control assessment module may model a lane change scenario as including states representing spatiotemporal locations for the autonomous vehicle 9100, spatiotemporal locations for the respective remote vehicles 9300, 9400, and corresponding blocking probabilities. A lane change scenario-specific operational control assessment module may model a lane change scenario as including actions such as 'maintain', 'accelerate', 'decelerate', and 'proceed' (change lanes). A lane change scenario-specific operational control assessment module may model a lane change scenario as including state transition probabilities representing probabilities that a respective remote vehicle 9300, 9400 enters an expected path 9110, 9120 of the autonomous vehicle 9100. For example, the first remote vehicle 9300 may enter the alternative expected path 9120 of the autonomous vehicle 9100 by traversing the second probable expected path 9320 for the first remote vehicle 9300 at a speed less than the speed of the autonomous vehicle 9100. As another example, the second remote vehicle 9400 may enter the alternative expected path 9120 of the autonomous vehicle 9100 by traversing the expected path 9410 for the second remote vehicle 9400 at a speed greater than the speed of the autonomous vehicle 9100. The state transition probabilities may be determined based on the operating environment information. A lane change scenario-specific operational control assessment module may model a lane change scenario as including negative-value rewards for violations of traffic control regulations, and as including a positive-value reward for completing the lane change scenario.
[0303] In one or more temporal locations, such as in each temporal location, the respective instantiated lane change scenario-specific operational control assessment module instances may receive, or otherwise access, the operating environment information corresponding to the respective lane change scenarios. For example, the second lane change scenario-specific operational control assessment module instance may receive operating environment information corresponding to the second lane change scenario, which may include the availability probability information for the area or portion of the vehicle transport network proximate to the point of convergence between the expected path 9410 for the second remote vehicle 9400 and the alternative expected path 9120 for the autonomous vehicle 9100.
[0304] In one or more temporal locations, such as in each temporal location, each instantiated lane change scenario-specific operational control assessment module instance may generate a respective candidate vehicle control action, such as 'maintain', 'accelerate', 'decelerate', or 'proceed', based on the respective modeled scenario and the corresponding operating environment information, and may output the respective candidate vehicle control action to the autonomous vehicle operational management controller, such as by sending the respective candidate vehicle control action to the autonomous vehicle operational management controller or by storing the respective candidate vehicle control action for access by the autonomous vehicle operational management controller.
[0305] In one or more temporal locations, such as in each temporal location, the autonomous vehicle operational management controller may receive candidate vehicle control actions from the respective instantiated lane change scenario-specific operational control assessment module instances, may identify a vehicle control action based on the received candidate vehicle control actions for controlling the autonomous vehicle 9100 at the corresponding temporal location, and may control the autonomous vehicle 9100 to traverse the vehicle transport network, or a portion thereof, in accordance with the identified vehicle control action.
[0306] In one or more temporal locations, such as in each temporal location, the autonomous vehicle operational management controller may determine whether one or more of the detected lane change scenarios has expired and, in response to determining that a lane change scenario has expired, may uninstantiate the corresponding lane change scenario-specific operational control assessment module instances.
[0307] The aspects, implementations, and examples described above have been set out to allow easy understanding of the disclosure and are not limiting. On the contrary, the disclosure covers various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as permitted under law.
Claims:
Claims (20)
[1]
1. Method for use when crossing a vehicle transport network, the method CHARACTERIZED by the fact that it comprises:
crossing, by means of an autonomous vehicle, a vehicle transport network, in which crossing the vehicle transport network includes:
receiving, from an autonomous vehicle sensor, sensor information corresponding to an external object within a defined distance from the autonomous vehicle;
identifying a distinct vehicle operating scenario in response to receiving the sensor information;
instantiating a scenario-specific operational control assessment module instance, wherein the scenario-specific operational control assessment module instance is an instance of a scenario-specific operational control assessment module modeling the distinct vehicle operating scenario;
receiving a candidate vehicle control action from the scenario-specific operational control assessment module instance; and crossing a part of the vehicle transport network based on the candidate vehicle control action.
[2]
2. Method, according to claim 1, CHARACTERIZED by the fact that crossing the part of the vehicle transport network based on the candidate vehicle control action includes determining whether to cross the part of the vehicle transport network in accordance with the candidate vehicle control action.
[3]
3. Method, according to claim 2, CHARACTERIZED by the fact that the candidate vehicle control action is one of stop, advance, or proceed.
[4]
4. Method, according to claim 3, CHARACTERIZED by the fact that crossing the part of the vehicle transport network according to the candidate vehicle control action includes:
in a condition in which the candidate vehicle control action is stop, controlling the autonomous vehicle to remain stationary;
in a condition in which the candidate vehicle control action is advance, controlling the autonomous vehicle to cross a defined warning distance in the vehicle transport network at a defined warning rate; and
in a condition in which the candidate vehicle control action is proceed, controlling the autonomous vehicle to cross the vehicle transport network in accordance with a previously identified vehicle control action.
[5]
5. Method, according to claim 1, CHARACTERIZED by the fact that instantiating the scenario-specific operational control assessment module instance includes:
identifying a convergence probability of spatiotemporal convergence between the external object and the autonomous vehicle; and instantiating the scenario-specific operational control assessment module instance in a condition in which the convergence probability exceeds a defined threshold.
[6]
6. Method, according to claim 1, CHARACTERIZED by the fact that crossing the vehicle transport network includes:
in response to crossing the part of the vehicle transport network based on the candidate vehicle control action:
identifying a second convergence probability of spatiotemporal convergence between the external object and the autonomous vehicle;
in a condition in which the second convergence probability exceeds the defined threshold:
receiving a second candidate vehicle control action from the scenario-specific operational control assessment module instance; and
crossing the part of the vehicle transport network based on the second candidate vehicle control action; and in a condition in which the second convergence probability is within the defined threshold:
uninstantiating the scenario-specific operational control assessment module instance.
[7]
7. Method, according to claim 1, CHARACTERIZED by the fact that crossing the vehicle transport network includes:
instantiating a second scenario-specific operational control assessment module instance; and receiving a second candidate vehicle control action from the second scenario-specific operational control assessment module instance.
[8]
8. Method, according to claim 7, CHARACTERIZED by the fact that identifying the distinct vehicle operating scenario includes identifying a second distinct vehicle operating scenario in response to receiving the sensor information, and wherein the second scenario-specific operational control assessment module instance is an instance of a second scenario-specific operational control assessment module modeling the second distinct vehicle operating scenario.
[9]
9. Method, according to claim 7, CHARACTERIZED by the fact that crossing the vehicle transport network includes receiving, from an autonomous vehicle sensor, second sensor information corresponding to a second external object within the defined distance of the autonomous vehicle .
[10]
10. Method, according to claim 9, CHARACTERIZED by the fact that crossing the vehicle transport network includes:
identifying the distinct vehicle operating scenario in response to receiving the second sensor information, wherein the second scenario-specific operational control assessment module instance is a second instance of the scenario-specific operational control assessment module.
[11]
11. Method, according to claim 9, CHARACTERIZED by the fact that crossing the vehicle transport network includes:
identifying a second distinct vehicle operating scenario in response to receiving the second sensor information, wherein the second scenario-specific operational control assessment module instance is an instance of a second scenario-specific operational control assessment module modeling the second distinct vehicle operating scenario.
[12]
12. Method, according to claim 7, CHARACTERIZED by the fact that crossing the part of the vehicle transport network includes crossing the part of the vehicle transport network based on the candidate vehicle control action and the second candidate vehicle control action.
[13]
13. Method, according to claim 12, CHARACTERIZED by the fact that crossing the part of the vehicle transport network includes:
in a condition in which the candidate vehicle control action differs from the second candidate vehicle control action, identifying one of the candidate vehicle control action or the second candidate vehicle control action as a chosen vehicle control action; and crossing the part of the vehicle transport network in accordance with the chosen vehicle control action.
[14]
14. Method, according to claim 1, CHARACTERIZED by the fact that instantiating the scenario-specific operational control assessment module instance includes:
in a condition in which identifying the distinct vehicle operating scenario includes identifying an intersection scenario, instantiating an intersection scenario-specific operational control assessment module instance, wherein the intersection scenario-specific operational control assessment module instance is an instance of an intersection scenario-specific operational control assessment module modeling the intersection scenario;
in a condition in which identifying the distinct vehicle operating scenario includes identifying a pedestrian scenario, instantiating a pedestrian scenario-specific operational control assessment module instance, wherein the pedestrian scenario-specific operational control assessment module instance is an instance of a pedestrian scenario-specific operational control assessment module modeling the pedestrian scenario; and
in a condition in which identifying the distinct vehicle operating scenario includes identifying a lane change scenario, instantiating a lane change scenario-specific operational control assessment module instance, wherein the lane change scenario-specific operational control assessment module instance is an instance of a lane change scenario-specific operational control assessment module modeling the lane change scenario.
[15]
15. Method for use when crossing a vehicle transport network, the method CHARACTERIZED by the fact that it comprises:
crossing, by means of an autonomous vehicle, a vehicle transport network, in which crossing the vehicle transport network includes:
generating an autonomous vehicle operational control environment for operating scenario-specific operational control assessment module instances, wherein each scenario-specific operational control assessment module instance is an instance of a respective scenario-specific operational control assessment module of a plurality of scenario-specific operational control assessment modules, wherein each scenario-specific operational control assessment module models a respective distinct vehicle operating scenario of a plurality of distinct vehicle operating scenarios, and wherein each scenario-specific operational control assessment module instance generates a respective candidate vehicle control action responsive to the corresponding respective distinct vehicle operating scenario;
receiving, from at least one sensor of a plurality of sensors of the autonomous vehicle, sensor information corresponding to one or more external objects within a defined distance from the autonomous vehicle;
identifying a first distinct vehicle operating scenario of the distinct vehicle operating scenarios in response to receiving the sensor information;
instantiating a first scenario-specific operational control assessment module instance of the scenario-specific operational control assessment module instances based on a first external object of the one or more external objects, wherein the first scenario-specific operational control assessment module instance is an instance of a first scenario-specific operational control assessment module of the plurality of scenario-specific operational control assessment modules, the first scenario-specific operational control assessment module modeling the first distinct vehicle operating scenario;
receiving a first candidate vehicle control action from the first scenario-specific operational control assessment module instance; and crossing a part of the vehicle transport network based on the first candidate vehicle control action.
[16]
16. Method, according to claim 15, CHARACTERIZED by the fact that instantiating the first scenario-specific operational control assessment module instance includes:
identifying a convergence probability of spatiotemporal convergence between the external object and the autonomous vehicle; and instantiating the first scenario-specific operational control assessment module instance in a condition in which the convergence probability exceeds a defined threshold.
[17]
17. Method, according to claim 16, CHARACTERIZED by the fact that crossing the vehicle transport network includes:
in response to crossing the part of the vehicle transport network based on the candidate vehicle control action:
identifying a second convergence probability of spatiotemporal convergence between the external object and the autonomous vehicle;
in a condition in which the second convergence probability exceeds the defined threshold:
receiving a second candidate vehicle control action from the scenario-specific operational control assessment module instance; and crossing the part of the vehicle transport network based on the second candidate vehicle control action; and in a condition in which the second convergence probability is within the defined threshold:
uninstantiating the scenario-specific operational control assessment module instance.
[18]
18. Method, according to claim 16, CHARACTERIZED by the fact that crossing the vehicle transport network includes:
instantiating a second scenario-specific operational control assessment module instance; and receiving a second candidate vehicle control action from the second scenario-specific operational control assessment module instance.
[19]
19. Autonomous vehicle, CHARACTERIZED by the fact that it comprises:
a processor configured to execute instructions stored on a non-transitory computer-readable medium to:
generate an autonomous vehicle operational control environment for operating scenario-specific operational control assessment module instances, wherein each scenario-specific operational control assessment module instance is an instance of a respective scenario-specific operational control assessment module of a plurality of scenario-specific operational control assessment modules, wherein each scenario-specific operational control assessment module models a respective distinct vehicle operating scenario of a plurality of distinct vehicle operating scenarios, and wherein each scenario-specific operational control assessment module instance generates a respective candidate vehicle control action responsive to the corresponding respective distinct vehicle operating scenario;
receive, from at least one sensor of a plurality of sensors of the autonomous vehicle, sensor information corresponding to one or more external objects within a defined distance from the autonomous vehicle;
identify a first distinct vehicle operating scenario of the distinct vehicle operating scenarios in response to receiving the sensor information;
instantiate a first scenario-specific operational control assessment module instance of the scenario-specific operational control assessment module instances based on a first external object of the one or more external objects, wherein the first scenario-specific operational control assessment module instance is an instance of a first scenario-specific operational control assessment module of the plurality of scenario-specific operational control assessment modules, the first scenario-specific operational control assessment module modeling the first distinct vehicle operating scenario;
Petition 870190075707, of 06/08/2019, p. 127/149
receive a first candidate vehicle control action from the first scenario-specific operational control assessment module instance; and control the autonomous vehicle to cross a part of the vehicle transport network based on the first candidate vehicle control action.
[20]
20. Autonomous vehicle, according to claim 19, CHARACTERIZED by the fact that the processor is configured to execute instructions stored on the non-transitory computer-readable medium to:
identify a convergence probability of spatiotemporal convergence between the external object and the autonomous vehicle; and instantiate the first scenario-specific operational control assessment module instance in a condition in which the convergence probability exceeds a defined threshold.
Similar technologies:
Publication number | Publication date | Patent title
BR112019016266A2|2020-04-07|autonomous vehicle operational management control
KR102199093B1|2021-01-06|Self-driving vehicle operation management, including operating a partially observable Markov decision process model instance
KR102305291B1|2021-09-29|Self-driving vehicle operation management
KR102090919B1|2020-05-18|Autonomous vehicle operation management interception monitoring
JP2021523057A|2021-09-02|Direction adjustment action for autonomous vehicle motion management
BR112020010209A2|2020-11-10|autonomous vehicle operational management controller
JP6963158B2|2021-11-05|Centralized shared autonomous vehicle operation management
US20210261123A1|2021-08-26|Autonomous Vehicle Operation with Explicit Occlusion Reasoning
Family patents:
Publication number | Publication date
RU2725920C1|2020-07-07|
CN110603497A|2019-12-20|
WO2018147872A1|2018-08-16|
US10654476B2|2020-05-19|
CN110603497B|2021-11-16|
KR102090920B1|2020-03-19|
CA3052952C|2021-06-01|
JP6969756B2|2021-11-24|
EP3580620A1|2019-12-18|
KR20190107169A|2019-09-18|
JP2020508509A|2020-03-19|
US20190329771A1|2019-10-31|
MX2019009392A|2019-11-28|
EP3580620A4|2020-04-22|
CA3052952A1|2018-08-16|
Cited references:
Publication number | Application date | Publication date | Applicant | Patent title

GB8430980D0|1984-12-07|1985-01-16|Robinson M|Generation of apparently three-dimensional images|
JP2728170B2|1988-10-25|1998-03-18|マツダ株式会社|Travel control device for mobile vehicles|
US5646845A|1990-02-05|1997-07-08|Caterpillar Inc.|System and method for controlling an autonomously navigated vehicle|
US7418346B2|1997-10-22|2008-08-26|Intelligent Technologies International, Inc.|Collision avoidance methods and systems|
US20040068351A1|2002-04-22|2004-04-08|Neal Solomon|System, methods and apparatus for integrating behavior-based approach into hybrid control model for use with mobile robotic vehicles|
US7242294B2|2003-09-17|2007-07-10|Agilent Technologies, Inc|System and method for using mobile collectors for accessing a wireless sensor network|
DE102004058703B4|2004-09-27|2006-11-16|Daimlerchrysler Ag|Arrangement and method for determining an arrangement of sensors on a motor vehicle|
ES2329701T3|2005-03-14|2009-11-30|Mp S.R.L.|COMMUNICATION, MONITORING AND CONTROL DEVICE, AND RELATED METHOD, FOR RAILWAY TRAFFIC.|
JP4591346B2|2005-12-28|2010-12-01|アイシン・エィ・ダブリュ株式会社|Inter-vehicle communication system|
US9373149B2|2006-03-17|2016-06-21|Fatdoor, Inc.|Autonomous neighborhood vehicle commerce network and community|
JP4254844B2|2006-11-01|2009-04-15|トヨタ自動車株式会社|Travel control plan evaluation device|
US20090088916A1|2007-09-28|2009-04-02|Honeywell International Inc.|Method and system for automatic path planning and obstacle/collision avoidance of autonomous vehicles|
US8655822B2|2008-03-12|2014-02-18|Aptima, Inc.|Probabilistic decision making system and methods of use|
US8244469B2|2008-03-16|2012-08-14|Irobot Corporation|Collaborative engagement for target identification and tracking|
US8260515B2|2008-07-24|2012-09-04|GM Global Technology Operations LLC|Adaptive vehicle control system with driving style recognition|
US8126642B2|2008-10-24|2012-02-28|Gray & Company, Inc.|Control and systems for autonomously driven vehicles|
EP2467290B1|2009-08-19|2014-10-08|Kelsey-Hayes Company|Fail safe operational steering system for autonomous driving|
US8452535B2|2010-12-13|2013-05-28|GM Global Technology Operations LLC|Systems and methods for precise sub-lane vehicle positioning|
JP5672057B2|2011-02-23|2015-02-18|株式会社デンソー|Manufacturing method of glow plug control device|
US20120233102A1|2011-03-11|2012-09-13|Toyota Motor Engin. & Manufact. N.A.|Apparatus and algorithmic process for an adaptive navigation policy in partially observable environments|
US8849483B2|2011-04-13|2014-09-30|California Institute Of Technology|Target trailing with safe navigation with colregs for maritime autonomous surface vehicles|
US8949018B2|2011-06-13|2015-02-03|Toyota Jidosha Kabushiki Kaisha|Driving assistance device and driving assistance method|
GB2494716B|2011-09-15|2019-12-18|Bae Systems Plc|Autonomous vehicle and task modelling|
CN103875000B|2011-09-22|2016-04-06|阿索恩公司|For the monitoring of autonomous mobile robot, diagnosis and trace tool|
DE102012005245A1|2012-03-14|2012-09-20|Daimler Ag|Method for assisting driver of motor vehicle by overtaking process on oppositely drivable lane, involves detecting environment size described by environment of motor vehicle|
US20130278441A1|2012-04-24|2013-10-24|Zetta Research and Development, LLC - ForC Series|Vehicle proxying|
US8781669B1|2012-05-14|2014-07-15|Google Inc.|Consideration of risks in active sensing for an autonomous vehicle|
CN102903258B|2012-07-09|2017-10-27|汤斌淞|A kind of vehicle automatic navigation method, navigation pattern information preparation method and its apparatus for vehicle navigation|
US9440650B2|2012-08-08|2016-09-13|Toyota Jidosha Kabushiki Kaisha|Collision prediction apparatus|
US10246030B2|2012-08-09|2019-04-02|Toyota Jidosha Kabushiki Kaisha|Object detection apparatus and driving assistance apparatus|
DE102012220134A1|2012-11-06|2014-05-08|Robert Bosch Gmbh|Method for detecting deliberate deviation from optimum travel route of vehicle between start point and target point, involves determining deviation from optimal travel route of vehicle, if probability satisfies predetermined criterion|
US9242647B2|2013-02-06|2016-01-26|GM Global Technology Operations LLC|Display systems and methods for autonomous vehicles|
US10347127B2|2013-02-21|2019-07-09|Waymo Llc|Driving mode adjustment|
US20140309930A1|2013-04-15|2014-10-16|Flextronics Ap, Llc|Automatic camera image retrieval based on route traffic and conditions|
DE102013206746B4|2013-04-16|2016-08-11|Ford Global Technologies, Llc|Method and device for modifying the configuration of a driver assistance system of a motor vehicle|
WO2015034923A2|2013-09-03|2015-03-12|Metrom Rail, Llc|Rail vehicle signal enforcement and separation control|
US9099004B2|2013-09-12|2015-08-04|Robert Bosch Gmbh|Object differentiation warning system|
US9718473B2|2013-10-11|2017-08-01|Nissan Motor Co., Ltd.|Travel control device and travel control method|
US20150106010A1|2013-10-15|2015-04-16|Ford Global Technologies, Llc|Aerial data for vehicle navigation|
EP3751459A1|2013-12-04|2020-12-16|Mobileye Vision Technologies Ltd.|Adjusting velocity of a vehicle for a curve|
US9889847B2|2013-12-24|2018-02-13|Volvo Truck Corporation|Method and system for driver assistance for a vehicle|
US9140554B2|2014-01-24|2015-09-22|Microsoft Technology Licensing, Llc|Audio navigation assistance|
EP2915718B1|2014-03-04|2018-07-11|Volvo Car Corporation|Apparatus and method for continuously establishing a boundary for autonomous driving availability and an automotive vehicle comprising such an apparatus|
JP6537780B2|2014-04-09|2019-07-03|日立オートモティブシステムズ株式会社|Traveling control device, in-vehicle display device, and traveling control system|
US9404761B2|2014-05-30|2016-08-02|Nissan North America, Inc.|Autonomous vehicle lane routing and navigation|
US20150345967A1|2014-06-03|2015-12-03|Nissan North America, Inc.|Probabilistic autonomous vehicle routing and navigation|
JP6451111B2|2014-07-10|2019-01-16|日産自動車株式会社|Driving support device and driving support method|
US10293816B2|2014-09-10|2019-05-21|Ford Global Technologies, Llc|Automatic park and reminder system and method of use|
JP6280850B2|2014-09-29|2018-02-14|日立建機株式会社|Obstacle avoidance system|
KR101664582B1|2014-11-12|2016-10-10|현대자동차주식회사|Path Planning Apparatus and Method for Autonomous Vehicle|
US9534910B2|2014-12-09|2017-01-03|Toyota Motor Engineering & Manufacturing North America, Inc.|Autonomous vehicle detection of and response to yield scenarios|
US9963215B2|2014-12-15|2018-05-08|Leidos, Inc.|System and method for fusion of sensor data to support autonomous maritime vessels|
US10216196B2|2015-02-01|2019-02-26|Prosper Technology, Llc|Methods to operate autonomous vehicles to pilot vehicles in groups or convoys|
DE102015201878A1|2015-02-04|2016-08-04|Continental Teves Ag & Co. Ohg|Semi-automated lane change|
KR20200127218A|2015-02-10|2020-11-10|모빌아이 비젼 테크놀로지스 엘티디.|Sparse map for autonomous vehicle navigation|
US20160260328A1|2015-03-06|2016-09-08|Qualcomm Incorporated|Real-time Occupancy Mapping System for Autonomous Vehicles|
US9630498B2|2015-06-24|2017-04-25|Nissan North America, Inc.|Vehicle operation assistance information management|
US10086699B2|2015-06-24|2018-10-02|Nissan North America, Inc.|Vehicle operation assistance information management for autonomous vehicle control operation|
EP3327693A4|2015-07-21|2018-07-18|Nissan Motor Co., Ltd.|Scene evaluation device, travel support device, and scene evaluation method|
US9934688B2|2015-07-31|2018-04-03|Ford Global Technologies, Llc|Vehicle trajectory determination|
US10002471B2|2015-09-30|2018-06-19|Ants Technology Limited|Systems and methods for autonomous vehicle navigation|
US9904286B2|2015-10-13|2018-02-27|Nokia Technologies Oy|Method and apparatus for providing adaptive transitioning between operational modes of an autonomous vehicle|
US9913104B2|2016-01-21|2018-03-06|General Motors Llc|Vehicle location services|
US9568915B1|2016-02-11|2017-02-14|Mitsubishi Electric Research Laboratories, Inc.|System and method for controlling autonomous or semi-autonomous vehicle|
DE102016203086B4|2016-02-26|2018-06-28|Robert Bosch Gmbh|Method and device for driver assistance|
DE102016203723A1|2016-03-08|2017-09-14|Robert Bosch Gmbh|Method and system for determining the pose of a vehicle|
US9645577B1|2016-03-23|2017-05-09|nuTonomy Inc.|Facilitating vehicle driving and self-driving|
CN105741595B|2016-04-27|2018-02-27|常州加美科技有限公司|A kind of automatic driving vehicle navigation travelling-crane method based on cloud database|
CN110603497B|2017-02-10|2021-11-16|Nissan North America, Inc.|Autonomous vehicle and method of autonomous vehicle operation management control|
CA3052953C|2017-02-10|2021-11-09|Nissan North America, Inc.|Autonomous vehicle operational management blocking monitoring|
JP6913353B2|2017-05-26|2021-08-04|株式会社データ変換研究所|Mobile control system|
WO2018230461A1|2017-06-16|2018-12-20|本田技研工業株式会社|Vehicle control system, vehicle control method and program|
US10860019B2|2017-09-08|2020-12-08|Motional Ad Llc|Planning autonomous motion|
US10836405B2|2017-10-30|2020-11-17|Nissan North America, Inc.|Continual planning and metareasoning for controlling an autonomous vehicle|
WO2019088989A1|2017-10-31|2019-05-09|Nissan North America, Inc.|Reinforcement and model learning for vehicle operation|
EP3717324A4|2017-11-30|2021-06-02|Nissan North America, Inc.|Autonomous vehicle operational management scenarios|
JP6979366B2|2018-02-07|2021-12-15|本田技研工業株式会社|Vehicle control devices, vehicle control methods, and programs|
JP6963158B2|2018-02-26|2021-11-05|ニッサン ノース アメリカ,インク|Centralized shared autonomous vehicle operation management|
US11086318B1|2018-03-21|2021-08-10|Uatc, Llc|Systems and methods for a scenario tagger for autonomous vehicles|
US11120688B2|2018-06-29|2021-09-14|Nissan North America, Inc.|Orientation-adjust actions for autonomous vehicle operational management|
US11260855B2|2018-07-17|2022-03-01|Baidu Usa Llc|Methods and systems to predict object movement for autonomous driving vehicles|
US10649453B1|2018-11-15|2020-05-12|Nissan North America, Inc.|Introspective autonomous vehicle operational management|
CN111338333B|2018-12-18|2021-08-31|北京航迹科技有限公司|System and method for autonomous driving|
US10955853B2|2018-12-18|2021-03-23|Beijing Voyager Technology Co., Ltd.|Systems and methods for autonomous driving|
CN109739246A|2019-02-19|2019-05-10|百度在线网络技术(北京)有限公司|Decision-making technique, device, equipment and storage medium during a kind of changing Lane|
CN109949596A|2019-02-28|2019-06-28|北京百度网讯科技有限公司|Vehicle exchange method and device for automatic driving vehicle|
US11188094B2|2019-04-30|2021-11-30|At&T Intellectual Property I, L.P.|Autonomous vehicle signaling system|
JP2021041757A|2019-09-09|2021-03-18|本田技研工業株式会社|Vehicle control device, and vehicle control method and program|
CN111081045A|2019-12-31|2020-04-28|智车优行科技(上海)有限公司|Attitude trajectory prediction method and electronic equipment|
US20210291865A1|2020-03-19|2021-09-23|Cartica Ai Ltd.|Predictive turning assistant|
US20220048513A1|2020-08-12|2022-02-17|Honda Motor Co., Ltd.|Probabilistic-based lane-change decision making and motion planning system and method thereof|
CN112373472B|2021-01-14|2021-04-20|长沙理工大学|Method for controlling vehicle entering time and running track at automatic driving intersection|
Legal status:
2021-10-19| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
Application number | Application date | Patent title
PCT/US2017/017502|WO2018147872A1|2017-02-10|2017-02-10|Autonomous vehicle operational management control|